In this webinar, we will share findings and insights from the maturity scorecards we have completed with the world's leading retailers: how they used these to secure executive sponsorship and keep the data, technology and business requirements in tandem, as well as the use cases typically pursued. We will discuss the typical organizational constructs we see at the different stages of maturity, and share some best practices for driving best-in-class processes for data-driven transformation.
8 Decimal Capital attended the Security Token Realised Conference in London on the 23rd-24th of January 2019 and presented on Investing in the Security Token Ecosystem. The presentation shared our thoughts on the space: the benefits of security tokens, the evolution of the security token hype cycle and liquidity, our investment criteria, and the types of projects we expect to face less friction in adoption. We also explained why we invested in Securitize, Open Finance and Bibox Bond, as well as a security token project we are currently working on with Fuhua Group, tokenizing $1B worth of buildings and hotels.
Thomas Lamirault & Mohamed Amine Abdessemed - A brief history of time with Apac... (Flink Forward)
Many use cases in the telecommunication industry require producing counters, quality metrics, and alarms in a streaming fashion with very low latency. Most of these metrics are only valuable when they are made available as soon as the associated events happen. In our company we were looking for a system able to produce this kind of real-time indicator, one that must handle massive amounts of data (400,000 events per second) with frequent peak loads (such as New Year's Eve) and out-of-order events caused by massive network disruption. Low latency and flexible window management with specific watermark emission are also must-haves. Heterogeneous formats, multi-flow correlation, and the possibility of late data arrival are other challenges. Flink being already widely used at Bouygues Telecom for real-time data integration, its features made it the evident candidate for the future system. In this talk, we'll present a real use case of streaming analytics using Flink, Kafka and HBase along with other legacy systems.
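To make the watermark and event-time window requirements above concrete, here is a minimal, hedged sketch using the PyFlink Table API (the talk itself contains no code); the table name, fields and Kafka connector options are invented for illustration, and the Kafka connector jar is assumed to be on the classpath.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

# Streaming TableEnvironment; event-time semantics are declared in the DDL below.
t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Hypothetical network-event source: the WATERMARK clause tolerates events
# arriving up to 10 seconds out of order before a window is finalized.
t_env.execute_sql("""
    CREATE TABLE network_events (
        cell_id STRING,
        status STRING,
        event_time TIMESTAMP(3),
        WATERMARK FOR event_time AS event_time - INTERVAL '10' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'network-events',
        'properties.bootstrap.servers' = 'localhost:9092',
        'format' = 'json'
    )
""")

# One-minute tumbling event-time windows producing per-cell error counters.
t_env.execute_sql("""
    SELECT
        cell_id,
        TUMBLE_START(event_time, INTERVAL '1' MINUTE) AS window_start,
        SUM(CASE WHEN status = 'ERROR' THEN 1 ELSE 0 END) AS error_count
    FROM network_events
    GROUP BY cell_id, TUMBLE(event_time, INTERVAL '1' MINUTE)
""").print()
```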
A merchant's guide to the steps that should be followed to build a merchandise category financial strategic plan.
Merch / Merchandise / Revenue / Gross Profit / Margin / Dollars / Data / Ecommerce / Omnichannel / Retail
Webinar: Detecting row patterns with Flink SQL - Dawid Wysakowicz (Ververica)
Apache Flink is one of the first open source stream processors that was able to address the full spectrum of stream processing applications, ranging from applications with low latency requirements to applications that process millions of events per second. On top of this powerful processing engine, the Flink community built APIs for complex event processing and streaming analytics, namely the CEP library and support for streaming SQL.
Recently, the Flink community has been integrating both APIs by extending Flink SQL to support the MATCH_RECOGNIZE clause for row pattern matching, introduced in the SQL:2016 standard.
I will discuss the new MATCH_RECOGNIZE feature and present use cases that benefit from pattern matching support in streaming SQL, such as process monitoring or anomaly detection. I will demonstrate the feature with a few example queries.
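For a flavour of what such a query can look like, here is a hedged sketch of a MATCH_RECOGNIZE pattern submitted through PyFlink; the `sensor_readings` table and its columns are invented for illustration and are assumed to have been registered beforehand (e.g. via a Kafka connector DDL). It flags a monotonic temperature rise followed by a drop, in the spirit of the anomaly-detection use cases mentioned above.

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Assumes a table `sensor_readings` (sensor_id, temperature, reading_time with a
# watermark) has already been registered in this environment.
t_env.execute_sql("""
    SELECT *
    FROM sensor_readings
        MATCH_RECOGNIZE (
            PARTITION BY sensor_id
            ORDER BY reading_time
            MEASURES
                FIRST(UP.reading_time) AS rise_start,
                LAST(UP.temperature)   AS peak_temperature,
                DOWN.reading_time      AS drop_time
            ONE ROW PER MATCH
            AFTER MATCH SKIP PAST LAST ROW
            PATTERN (UP+ DOWN)
            DEFINE
                UP   AS LAST(UP.temperature, 1) IS NULL
                        OR UP.temperature > LAST(UP.temperature, 1),
                DOWN AS DOWN.temperature < LAST(UP.temperature)
        ) AS alerts
""").print()
```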
(Jason Gustafson, Confluent) Kafka Summit SF 2018
Kafka has a well-designed replication protocol, but over the years, we have found some extremely subtle edge cases which can, in the worst case, lead to data loss. We fixed the cases we were aware of in version 0.11.0.0, but shortly after that, another edge case popped up and then another. Clearly we needed a better approach to verify the correctness of the protocol. What we found is Leslie Lamport’s specification language TLA+.
In this talk I will discuss how we have stepped up our testing methodology in Apache Kafka to include formal specification and model checking using TLA+. I will cover the following:
1. How Kafka replication works
2. What weaknesses we have found over the years
3. How these problems have been fixed
4. How we have used TLA+ to verify the fixed protocol.
This talk will give you a deeper understanding of Kafka replication internals and its semantics. The replication protocol is a great case study in the complex behavior of distributed systems. By studying the faults and how they were fixed, you will have more insight into the kinds of problems that may lurk in your own designs. You will also learn a little bit of TLA+ and how it can be used to verify distributed algorithms.
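The talk's formal specifications are written in TLA+, which is not shown here. Purely as a loose, hedged analogy for what exhaustive model checking buys you, the toy Python sketch below enumerates every reachable state of a drastically simplified leader/follower log and checks a "no acknowledged record is lost" invariant. It is not Kafka's replication protocol and not TLA+; but note that weakening the acknowledgement rule (acking before the follower has replicated the record) makes the checker return a counterexample, which is exactly the kind of subtle edge case described above.

```python
# Toy exhaustive state exploration, in the spirit of TLA+/TLC model checking.

INITIAL = ((), (), ())  # (leader_log, follower_log, acked_records)

def next_states(state):
    leader, follower, acked = state
    # A client appends a new record to the leader (bounded to keep the model finite).
    if len(leader) < 2:
        yield (leader + (len(leader),), follower, acked)
    # The follower fetches the next missing record from the leader.
    if len(follower) < len(leader):
        yield (leader, leader[:len(follower) + 1], acked)
    # The leader acknowledges a record only once the follower has replicated it.
    # (Acknowledging right after the local write would break the invariant.)
    for record in leader:
        if record in follower and record not in acked:
            yield (leader, follower, acked + (record,))
    # The leader fails and the follower is elected with its own (possibly shorter) log.
    yield (follower, follower, acked)

def no_acked_record_lost(state):
    leader, _follower, acked = state
    return all(record in leader for record in acked)

def check(invariant):
    seen, frontier = {INITIAL}, [INITIAL]
    while frontier:
        state = frontier.pop()
        if not invariant(state):
            return state  # a counterexample reachable from the initial state
        for nxt in next_states(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return None

print(check(no_acked_record_lost) or "invariant holds in every reachable state")
```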
Understanding Proof of Work (PoW) and Proof of Stake (PoS) Algorithms (Gautam Anand)
We will focus on understanding the Proof of Stake (PoS) algorithm, how it differs from the Proof of Work algorithm, its performance benefits, and a security overview. We will also discuss the upcoming blockchain protocols that are planning to move to PoS.
"You can download this product from SlideTeam.net"
Use our content-ready Strategic Portfolio Management PowerPoint Presentation Slides to showcase the management of various securities in order to meet investment goals. The investment management strategy PowerPoint complete deck comprises professional slides such as objectives of portfolio management, types of investments, market scenario overview, investment instruments, securities portfolio, analysis and valuation of equity securities, PESTEL industry analysis, SWOT analysis, discounted cash flow method, financial statement analysis, company cash flow statement, investment in special situations, fixed income and leveraged securities, bond valuation system, reinvestment risk table, types of convertible securities, options analysis, warrants summarization overview, derivative products, put and call options, stock index futures and options, stock indexes comparison table, broadening the investment perspective, international security market highlights, global market trends, mutual funds investment criteria overview, investment in real estate, diversified real estate classification, KPIs and dashboards, etc. Download the investment portfolio management PPT visuals to analyze risk and return on investment. https://bit.ly/3sauHXW
Apache Kafka in Financial Services - Use Cases and Architectures (Kai Wähner)
The Rise of Event Streaming in Financial Services - Use Cases, Architectures and Examples powered by Apache Kafka.
The New FinServ Enterprise Reality: Every company is a software company. Innovate OR be Disrupted. Learn how Event Streaming with Apache Kafka and its ecosystem help...
More details:
https://www.kai-waehner.de/apache-kafka-financial-services-industry-banking-finserv-payment-fraud-middleware-messaging-transactions
https://www.kai-waehner.de/blog/2020/04/15/apache-kafka-machine-learning-banking-finance-industry/
https://www.kai-waehner.de/blog/2020/04/24/mainframe-offloading-replacement-apache-kafka-connect-ibm-db2-mq-cdc-cobol/
Optimising Geospatial Queries with Dynamic File Pruning (Databricks)
One of the most significant benefits provided by Databricks Delta is the ability to use z-ordering and dynamic file pruning to significantly reduce the amount of data that is retrieved from blob storage and therefore drastically improve query times, sometimes by an order of magnitude.
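As a rough illustration of the Delta technique described above (not code from the talk), the sketch below shows the kind of Spark SQL you would run on Databricks to z-order a Delta table on geospatial columns so that file pruning can skip irrelevant files at query time; the table and column names are assumptions, and OPTIMIZE ... ZORDER BY is assumed to be available in the runtime.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Cluster the Delta table's files by the columns used in spatial filters, so that
# file-level min/max statistics become selective for those columns.
spark.sql("OPTIMIZE geo_events ZORDER BY (latitude, longitude)")

# A selective spatial join like this lets dynamic file pruning skip files whose
# latitude/longitude ranges cannot match, reducing the data read from blob storage.
result = spark.sql("""
    SELECT e.*
    FROM geo_events e
    JOIN regions_of_interest r
      ON e.latitude  BETWEEN r.min_lat AND r.max_lat
     AND e.longitude BETWEEN r.min_lon AND r.max_lon
""")
result.show()
```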
The Need For Effective Early Engagement In Solution Architecture And Design (Alan McSweeney)
Early engagement in the solution delivery process needs to occur before any solution delivery project is initiated. Its objective is to understand the scope, requirements, objectives, approach and options, and to get a high-level understanding of the likely resources, timescale and cost required before starting the project.
Fundamentally, early engagement is about managing risk:
• Risk of doing the wrong thing
• Risk of doing it in the wrong way
• Risk of underestimating complexity and scope of work
• Risk of higher than expected cost of operation and maintenance
• Risk of underestimating organisation change impact and organisation resistance
• Risk through uncertainty
Early engagement is not a requirements gathering exercise. Traditional requirements gathering demands substantial initial effort, resources and cost, and asks the business to commit before doubts, uncertainties and ambiguities are known.
Early engagement involves taking a not necessarily well-defined request from the business and quickly and accurately creating an unambiguous set of solution options, including their delivery and operation.
This paper describes an approach to early engagement in the solution definition process.
Apache Arrow is a new standard for in-memory columnar data processing. It is a complement to Apache Parquet and Apache ORC. In this deck we review key design goals and how Arrow works in detail.
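For a concrete feel of the Arrow/Parquet relationship described above, here is a small, hedged pyarrow sketch (the column names and values are invented): Arrow holds the data in columnar form in memory, while Parquet is the complementary on-disk columnar format it reads from and writes to.

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Build an in-memory Arrow table (columnar layout, zero-copy friendly).
table = pa.table({
    "event_id": pa.array([1, 2, 3], type=pa.int64()),
    "latency_ms": pa.array([12.5, 7.3, 40.1], type=pa.float64()),
})

# Write to Parquet on disk and read back only the column we need.
pq.write_table(table, "events.parquet")
roundtrip = pq.read_table("events.parquet", columns=["latency_ms"])

print(roundtrip.schema)
print(roundtrip.column("latency_ms").to_pylist())  # [12.5, 7.3, 40.1]
```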
Across the globe energy systems are changing, creating unprecedented challenges for the organisations tasked with ensuring the lights stay on. In the UK, National Grid is facing shrinking margins, looming capacity shortages and unpredictable peaks and troughs in energy supply caused by increasing levels of renewable penetration. Open Energi uses its IoT technology to unlock demand-side capacity - from industrial equipment, co-generation and battery storage systems - creating a smarter grid; one that is cleaner, cheaper, more secure and more efficient.
I'll talk about how we use Apache Nifi to orchestrate and coordinate Machine Learning microservices that operate on streams of data coming from IoT devices, providing a layer of fault-tolerance and traceability. With built-in retry logic, backpressure and clustering, Nifi helps us keep hard problems away from our code. It comes with processors that integrate with our cloud provider of choice (Microsoft Azure), fitting seamlessly into our processing pipeline. Finally, its straightforward graphical interface makes it easy enough to use that any team member can step in and troubleshoot a flow with little training.
Building real time analytics applications using Pinot: A LinkedIn case study (Kishore Gopalakrishna)
LinkedIn is the most advantageous social networking tool available to job seekers and business professionals today, with 610+ million members creating millions of posts, videos, and articles that generate tens of millions of shares, comments, and likes per day. LinkedIn has leveraged this activity data to build rich, interactive, user-facing analytics applications like “Who Viewed My Profile”, Talent Insights, Ad Analytics, and Publisher Analytics, among others. These applications are all powered by Pinot, as are internal dashboards and the anomaly detection and root cause analysis platform ThirdEye. This talk will present how Pinot has become the de facto solution for serving analytic queries in milliseconds, ad-hoc reporting, monitoring and anomaly detection on multidimensional data.
Re-imagine Data Monitoring with whylogs and Spark (Databricks)
In the era of microservices, decentralized ML architectures and complex data pipelines, data quality has become a bigger challenge than ever. When data is involved in complex business processes and decisions, bad data can, and will, affect the bottom line. As a result, ensuring data quality across the entire ML pipeline is both costly and cumbersome, while data monitoring is often fragmented and performed ad hoc. To address these challenges, we built whylogs, an open source standard for data logging. It is a lightweight data profiling library that enables end-to-end data profiling across the entire software stack. The library implements a language- and platform-agnostic approach to data quality and data monitoring. It can work with different modes of data operations, including streaming, batch and IoT data.
In this talk, we will provide an overview of the whylogs architecture, including its lightweight statistical data collection approach and various integrations. We will demonstrate how the whylogs integration with Apache Spark achieves large scale data profiling, and we will show how users can apply this integration into existing data and ML pipelines.
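The Spark integration itself is covered in the talk; as a minimal, hedged taste of the library, the sketch below assumes the whylogs v1 Python API and an invented pandas DataFrame, and shows what basic data profiling looks like.

```python
import pandas as pd
import whylogs as why

# A tiny, invented batch of data standing in for one slice of a pipeline.
df = pd.DataFrame({
    "amount": [12.5, 99.0, 7.3, 1050.0],
    "country": ["FI", "SE", "FI", "NO"],
})

# Profile the batch: whylogs collects lightweight statistical summaries
# (counts, types, distributions) rather than retaining the raw rows.
profile_view = why.log(df).profile().view()

# Profiles can be inspected, merged across batches, or written out so that
# data quality can be monitored over time.
print(profile_view.to_pandas().head())
```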
Getting Started with Databricks SQL Analytics (Databricks)
It has long been said that business intelligence needs a relational warehouse, but that view is changing. With the Lakehouse architecture being shouted from the rooftops, Databricks have released SQL Analytics, an alternative workspace for SQL-savvy users to interact with an analytics-tuned cluster. But how does it work? Where do you start? What does a typical Data Analyst’s user journey look like with the tool?
This session will introduce the new workspace and walk through the various key features – how you set up a SQL Endpoint, the query workspace, creating rich dashboards and connecting up BI tools such as Microsoft Power BI.
If you’re truly trying to create a Lakehouse experience that satisfies your SQL-loving Data Analysts, this is a tool you’ll need to be familiar with and include in your design patterns, and this session will set you on the right path.
8 Decimal Capital Security Token Industry Overview (KadeemClarke3)
8 Decimal Capital, a leading fund in the blockchain venture capital space, has begun focusing on security tokens (STs) and security token offerings (STOs). We believe this new technology will revolutionize the financial industry and how assets are managed and traded.
In this lecture I discussed the changes expected in the new edition of the project management body of knowledge,
PMBOK 7th Edition.
I gave a brief overview of the previous editions of the PMBOK guide, then explained why a new edition is being produced and the reasons behind this change.
After that, I covered the change from the PMBOK Guide 6th Edition to the PMBOK Guide 7th Edition, which is expected to be released in the fourth quarter of 2020.
I explained in detail the changes made to the Standard for Project Management, as well as the changes made to the Guide to the Project Management Body of Knowledge.
This change is considered historic: the PMBOK Guide 7th Edition becomes principle-based instead of process-based, which meant removing the Process Groups, Knowledge Areas and ITTOs entirely from the 7th Edition, so that it can be used by everyone working in project management regardless of how they run their projects, whether
Waterfall, Agile, Design Thinking, Lean Startup, Kanban, Hybrid or any other approach.
I also talked about the new digital platform to which everything that supports practical project management will be moved, linked to everything published by the Project Management Institute:
PMI Digital Content Platform: Standards Plus™
You can watch the lecture on my YouTube channel at this link:
https://youtu.be/DGaaLKBJMAA
White Star Capital Germany Venture Capital Landscape 2020 (JeandeLencquesaing)
We are pleased to publish the second edition of our German Venture Capital report and hope you will enjoy reading it. 2019 was a year in which Germany really played to its strengths and cemented its position as one of the European leaders in tech venture capital, and we are more excited than ever about the development of this ecosystem.
Our report unpacks the current progress and outlook for the German ecosystem using our ecosystem model to highlight Germany’s unique positioning in an increasingly global playing field for startups.
So what did we find?
- Germany had a record year in VC reaching $5.7bn in funding with 49% yoy growth, the second best funded country in Europe
- Germany leverages its global industrial leadership to retain its place as the top destination for European mobility VC investment in 2019, reaching $1.3bn in funding, representing 26% of total funding. Its corporate strengths also drive investments in fintech and B2B software, representing 23% and 20% of total funding, respectively.
- Corporate Venture Capital plays a key role and participates in 58% of the total funding, the highest level worldwide. Next47 (Siemens), IFB Hamburg, Bosch are some of the most active German CVCs. As LPs (investors in VCs), corporates represent 28% of total German VC funds raised, the highest level in Europe, further boosting the local ecosystem
In addition to sharing our excitement about Germany and expressing our belief that the ecosystem is stronger than ever, we look at robust business networks, the continued government support via entities such as KfW, and the vibrant founder community.
White Star Capital has made landmark investments in Germany and seen many of the findings play out with our portfolio companies. Tier has raised Series B in 2019 led by international investors such as Mubadala, Goodwater and ourselves, while Clark has benefited from a large domestic market for insurance.
Outline of PPM, Project and Portfolio Management, and its use in Project Management disciplines
If you would like a copy of the slides, please email me
Transforming your company into a data-driven and data-aware company can be complex. Everything from knowing where to start, to executive buy-in, to grandfathered processes can slow data maturity and business growth. The journey begins with understanding the opportunities unique to your business based on your level of data maturity.
In this session, we will share findings and insights from customers: how they used these to secure executive sponsorship and keep the data, technology and business requirements in tandem, as well as the use cases typically pursued. We will discuss the typical organizational constructs we see at the different stages of maturity, and share some best practices for driving best-in-class processes for data-driven transformation.
The explosion of data is catalyzing new business models and reshaping industries. No longer can you amble your way forward in the age of Big Data; the challenges are too great to address on an ad-hoc basis and the business potential too vast to simply dismiss.
What's in store for Big Data in 2015? Will the 'Internet of Things' fuel the Industrial Internet? Will Big Data get Cloudy? Check out the top five Big Data predictions for 2015 according to Quentin Gallivan, CEO, Pentaho.
The study was conducted by Avaus Marketing Innovations, a leading data-driven marketing agency, together with ISS, a leading marketing research company. We asked Swedish and Finnish CMOs, CTOs and COOs to assess the state of data and analytics in their companies, in four major areas:
1. Strategy and business objectives
2. Investments
3. People, processes and leadership
4. Tools and technologies
Download the full study for free here: https://www.avaus.fi/en/state-of-analytics/
A combination of technology advances, evolving customer expectations, process evolutions (e.g., digitization), and new business models are forcing organizations to re-think their IT strategies in 2020. In the end, the decisions technology executives make can impact differentiation, growth and scale, profitability, customer satisfaction and speed-to-market. Here are some important facts to consider about digital transformation, and the core elements of success, when evaluating next steps.
10 Enterprise Analytics Trends to Watch in 2019 (MicroStrategy)
View insights from Forrester analyst Mike Gualtieri, Constellation Research’s Ray Wang and Doug Henschen, Ventana Research’s Mark Smith and David Menninger, IDC’s Chandana Gopal, Marcus Borba, Ronald van Loon and other top analytics and business intelligence thought leaders.
IAB Netherlands report: Report on Digital Marketing Innovation (IAB Europe)
With this survey, IAB Netherlands charts the digital innovation agenda of leading marketers in the Netherlands. In cooperation with Deloitte Digital, we interviewed 22 top marketers about the state of digital marketing in their organizations and discussed their expectations for the coming three years.
In partnership with IDG, our 2022 Insight Intelligent Technology™ Report examines how companies are making progress on long-term IT strategies to meet the changing, post-pandemic expectations of their businesses, their employees, and the market more broadly.
Digital transformation is fundamentally changing people's lives and the ways companies do business. Around the world, we're working to develop solutions that give time back, make us safer and healthier, and bring significant environmental benefits. People around the world are working hard to create a future where we're never delayed during air travel due to mechanical issues. Where smart buildings have ambient intelligence that allows meeting rooms to adjust to your preferences. They're envisioning a world where automobile accidents are almost nonexistent, and your car becomes a living room or office on wheels. And a world where medical treatment is personalized based on your DNA, dramatically improving your health and quality of life. This is what Microsoft calls the digital difference.
We asked Harvard Business Review Analytic Services to help us look at the pace of innovation and how prepared business leaders are for this change. We also wanted to know what projects mattered most and what industries were most receptive to and ready for change.
We were surprised by the strategy gap and encouraged by the optimism. Business leaders know their industries are ripe for transformation, and in most cases are eager to bring the benefits of technology to their businesses.
At Microsoft, we aim to partner with business leaders to find the digital difference they can make. Partnering with companies of all sizes, we recognize that one big idea isn't enough anymore. Decades ago an innovative shoe design, a beautiful device, or smartly designed software could lead a company to achieve market dominance for a long time. But now micro revolutions occur every 12-18 months, so companies must be in a continual state of transformation.
We are moving into a time when rapid innovation and speed to market are more critical than ever. This makes the partnership between humans and machines critical: when we combine people's ideas and creativity with advanced technology, we get digital leadership.
A business leader interviewed for the study said we need to transform "the engine of the company." To do this, leaders need to bring in tech and cultural changes that empower their employees, engage customers in new ways, optimize operations, and transform products. Rebuilding an organization around these areas creates a fully digital company that can change ahead of its customers and competition.
See above for our H1 2014 Digital Media and Internet market update - an overview on M&A transactions, relevant public equities, and key investments in the space through the Horizon Partners lens.
Outlook on Artificial Intelligence in the Enterprise 2016 (Narrative Science)
Based on a survey of 235 senior business executives, Narrative Science analyzed respondents' data to identify the top four trends of artificial intelligence in the enterprise.
Similar to Big Data Maturity as a Business: A Retail Case Study
Hortonworks DataFlow (HDF) 3.3 - Taking Stream Processing to the Next Level (Hortonworks)
The HDF 3.3 release delivers several exciting enhancements and new features, but the most noteworthy of them is the addition of support for Kafka 2.0 and Kafka Streams.
https://hortonworks.com/webinar/hortonworks-dataflow-hdf-3-3-taking-stream-processing-next-level/
IoT Predictions for 2019 and Beyond: Data at the Heart of Your IoT Strategy (Hortonworks)
Forrester forecasts* that direct spending on the Internet of Things (IoT) will exceed $400 Billion by 2023. From manufacturing and utilities, to oil & gas and transportation, IoT improves visibility, reduces downtime, and creates opportunities for entirely new business models.
But successful IoT implementations require far more than simply connecting sensors to a network. The data generated by these devices must be collected, aggregated, cleaned, processed, interpreted, understood, and used. Data-driven decisions and actions must be taken, without which an IoT implementation is bound to fail.
https://hortonworks.com/webinar/iot-predictions-2019-beyond-data-heart-iot-strategy/
Getting the Most Out of Your Data in the Cloud with Cloudbreak (Hortonworks)
Cloudbreak, a part of Hortonworks Data Platform (HDP), simplifies the provisioning and cluster management within any cloud environment to help your business toward its path to a hybrid cloud architecture.
https://hortonworks.com/webinar/getting-data-cloud-cloudbreak-live-demo/
Johns Hopkins - Using Hadoop to Secure Access Log Events (Hortonworks)
In this webinar, we talk with experts from Johns Hopkins as they share techniques and lessons learned from a real-world Apache Hadoop implementation.
https://hortonworks.com/webinar/johns-hopkins-using-hadoop-securely-access-log-events/
Catch a Hacker in Real-Time: Live Visuals of Bots and Bad Guys (Hortonworks)
Cybersecurity today is a big data problem. There's a ton of data landing on you faster than you can load it, let alone search it. To make sense of it, we need to act on data-in-motion, using both machine learning and the most advanced pattern recognition system on the planet: your SOC analysts. Advanced visualization makes your analysts more efficient, helping them find the hidden gems, or bombs, in masses of logs and packets.
https://hortonworks.com/webinar/catch-hacker-real-time-live-visuals-bots-bad-guys/
We have introduced several new features as well as delivered some significant updates to keep the platform tightly integrated and compatible with HDP 3.0.
https://hortonworks.com/webinar/hortonworks-dataflow-hdf-3-2-release-raises-bar-operational-efficiency/
Curing Kafka Blindness with Hortonworks Streams Messaging Manager (Hortonworks)
With the growth of Apache Kafka adoption in all major streaming initiatives across large organizations, the operational and visibility challenges associated with Kafka are on the rise as well. Kafka users want better visibility in understanding what is going on in the clusters as well as within the stream flows across producers, topics, brokers, and consumers.
With no tools in the market that readily address the challenges of the Kafka Ops teams, the development teams, and the security/governance teams, Hortonworks Streams Messaging Manager is a game-changer.
https://hortonworks.com/webinar/curing-kafka-blindness-hortonworks-streams-messaging-manager/
Interpretation Tool for Genomic Sequencing Data in Clinical Environments (Hortonworks)
The healthcare industry—with its huge volumes of big data—is ripe for the application of analytics and machine learning. In this webinar, Hortonworks and Quanam present a tool that uses machine learning and natural language processing in the clinical classification of genomic variants to help identify mutations and determine clinical significance.
Watch the webinar: https://hortonworks.com/webinar/interpretation-tool-genomic-sequencing-data-clinical-environments/
IBM+Hortonworks = Transformation of the Big Data Landscape (Hortonworks)
Last year IBM and Hortonworks jointly announced a strategic and deep partnership. Join us as we take a close look at the partnership accomplishments and the conjoined road ahead with industry-leading analytics offers.
View the webinar here: https://hortonworks.com/webinar/ibmhortonworks-transformation-big-data-landscape/
In this exclusive Premier Inside Out, you will hear from Druid committer Slim Bouguerra, Staff Software Engineer and Product Manager Will Xu. These Hortonworkers will explain the vision of these components, review new features, share some best practices and answer your questions.
View the webinar here: https://hortonworks.com/webinar/hortonworks-premier-apache-druid/
Accelerating Data Science and Real Time Analytics at Scale (Hortonworks)
Gaining business advantages from big data is moving beyond just the efficient storage and deep analytics on diverse data sources to using AI methods and analytics on streaming data to catch insights and take action at the edge of the network.
https://hortonworks.com/webinar/accelerating-data-science-real-time-analytics-scale/
TIME SERIES: APPLYING ADVANCED ANALYTICS TO INDUSTRIAL PROCESS DATA (Hortonworks)
Thanks to sensors and the Internet of Things, industrial processes now generate a sea of data. But are you plumbing its depths to find the insight it contains, or are you just drowning in it? Now, Hortonworks and Seeq team up to bring advanced analytics and machine learning to time-series data from manufacturing and industrial processes.
Blockchain with Machine Learning Powered by Big Data: Trimble Transportation ... (Hortonworks)
Trimble Transportation Enterprise is a leading provider of enterprise software to over 2,000 transportation and logistics companies. They have designed an architecture that leverages Hortonworks Big Data solutions and Machine Learning models to power up multiple Blockchains, which improves operational efficiency, cuts down costs and enables building strategic partnerships.
https://hortonworks.com/webinar/blockchain-with-machine-learning-powered-by-big-data-trimble-transportation-enterprise/
Delivering Real-Time Streaming Data for Healthcare Customers: Clearsense (Hortonworks)
For years, the healthcare industry has had problems of data scarcity and latency. Clearsense solved the problem by building an open-source Hortonworks Data Platform (HDP) solution while providing decades' worth of clinical expertise. Clearsense is delivering smart, real-time streaming data to its healthcare customers, enabling mission-critical data to feed clinical decisions.
https://hortonworks.com/webinar/delivering-smart-real-time-streaming-data-healthcare-customers-clearsense/
Making Enterprise Big Data Small with Ease (Hortonworks)
Every division in an organization builds its own database to keep track of its business. When the organization becomes big, those individual databases grow as well. The data in each database may become siloed, with no awareness of the data in the other databases.
https://hortonworks.com/webinar/making-enterprise-big-data-small-ease/
Driving Digital Transformation Through Global Data Management (Hortonworks)
Using your data smarter and faster than your peers could be the difference between dominating your market and merely surviving. Organizations are investing in IoT, big data, and data science to drive better customer experience and create new products, yet these projects often stall in the ideation phase due to a lack of global data management processes and technologies. Your new data architecture may be taking shape around you, but your goal of globally managing, governing, and securing your data across a hybrid, multi-cloud landscape can remain elusive. Learn how industry leaders are developing their global data management strategy to drive innovation and ROI.
Presented at Gartner Data and Analytics Summit
Speaker:
Dinesh Chandrasekhar
Director of Product Marketing, Hortonworks
HDF 3.1 pt. 2: A Technical Deep-Dive on New Streaming Features (Hortonworks)
Hortonworks DataFlow (HDF) is the complete solution that addresses the most complex streaming architectures of today’s enterprises. More than 20 billion IoT devices are active on the planet today and thousands of use cases across IIOT, Healthcare and Manufacturing warrant capturing data-in-motion and delivering actionable intelligence right NOW. “Data decay” happens in a matter of seconds in today’s digital enterprises.
To meet all the needs of such fast-moving businesses, we have made significant enhancements and new streaming features in HDF 3.1.
https://hortonworks.com/webinar/series-hdf-3-1-technical-deep-dive-new-streaming-features/
Hortonworks DataFlow (HDF) 3.1 - Redefining Data-In-Motion with Modern Data A... (Hortonworks)
Join the Hortonworks product team as they introduce HDF 3.1 and the core components for a modern data architecture to support stream processing and analytics.
You will learn about the three main themes that HDF addresses:
Developer productivity
Operational efficiency
Platform interoperability
https://hortonworks.com/webinar/series-hdf-3-1-redefining-data-motion-modern-data-architectures/
Unlock Value from Big Data with Apache NiFi and Streaming CDC (Hortonworks)
Apache NiFi is an easy to use, powerful, and reliable system to process and distribute data. It provides an end-to-end platform that can collect, curate, analyze, and act on data in real-time, on-premises, or in the cloud with a drag-and-drop visual interface. It is being used across industries on large amounts of data that had been stored in isolation, which made collaboration and analysis difficult.
Join industry experts from Hortonworks and Attunity as they explain how Apache NiFi and streaming CDC technology provides a distributed, resilient platform for unlocking the value of data in new ways.
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We closed with a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
GraphRAG is All You need? LLM & Knowledge Graph (Guy Korland)
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
State of ICS and IoT Cyber Threat Landscape Report 2024 preview (Prayukth K V)
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Securing your Kubernetes cluster: a step-by-step guide to success! (KatiaHIMEUR1)
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Epistemic Interaction - tuning interfaces to provide information for AI support (Alan Dix)
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... (DanBrown980551)
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
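As a taste of the Python binding mentioned above, here is a minimal, hedged sketch assuming the pypowsybl package and one of its bundled example networks (the IEEE 14-bus case) rather than a real grid model.

```python
import pypowsybl as pp

# Load one of the example networks bundled with PowSyBl (IEEE 14-bus case).
network = pp.network.create_ieee14()

# Run an AC power flow on the network and check the result status.
results = pp.loadflow.run_ac(network)
print(results[0].status)

# Inspect bus voltages computed by the load flow as a pandas DataFrame.
print(network.get_buses()[["v_mag", "v_angle"]].head())
```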
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
PHP Frameworks: I want to break free (IPC Berlin 2024) (Ralf Eggert)
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk encourages a more independent, flexible, and future-proof approach to working with PHP frameworks.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe (Paige Cruz)
Monitoring and observability aren't traditionally found in software curriculums, so many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company's observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to ops, infra, and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and I will share the foundational concepts to build on.
Threats to mobile devices are increasingly prevalent and growing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many of those features trade security for convenience and capability. This best practices guide outlines steps users can take to better protect their personal devices and information.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... (Neo4j)
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
GridMate - End to end testing is a critical piece to ensure quality and avoid... (ThomasParaiso2)
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... (SOFTTECHHUB)
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
UiPath Test Automation using UiPath Test Suite series, part 4 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimizing testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are the slides of the talk given at the IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), 2022.
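To make the intuition concrete, here is a toy Python sketch of the general idea; it is not the authors' DIAR implementation. It treats a byte position as "uninteresting" if a crude probe mutation at that position never changes the observed coverage. The `get_coverage` callable is a hypothetical stand-in for running the target under coverage instrumentation (e.g. an AFL-style edge map).

```python
# Toy sketch of the idea behind DIAR (not the authors' implementation):
# find seed byte positions that look "uninteresting" -- flipping them never
# changes the program's observed coverage -- and trim them from the seed so a
# fuzzer does not waste mutations on them.
# `get_coverage(data)` is a hypothetical stand-in for instrumented execution;
# it should return a hashable coverage summary (e.g. a frozenset of hit edges).

def uninteresting_positions(seed, get_coverage):
    baseline = get_coverage(seed)
    boring = set()
    for i in range(len(seed)):
        mutated = bytearray(seed)
        mutated[i] ^= 0xFF  # one crude probe mutation per byte position
        if get_coverage(bytes(mutated)) == baseline:
            boring.add(i)
    return boring

def trim_seed(seed, get_coverage):
    boring = uninteresting_positions(seed, get_coverage)
    return bytes(b for i, b in enumerate(seed) if i not in boring)
```

A real implementation would need to be far more careful (multiple probe mutations per position, format-aware trimming, and re-validating that the trimmed seed still parses), but the sketch captures the core trade-off: spend a little coverage-guided analysis up front to avoid wasting mutations later.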
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
18.
Use Case: Single View of Customer
• Ability to identify # unique customers which directly impacts both the top-line ROI measurement and bottom–line optimization
• Increased customer loyalty, LFL sales, average basket size, redemption propensity on promotional activity, listing fees
• Dynamic real time targeted pricing which results in better margins from your most loyal customers
Business Value
• Better customer experience leading to increased loyalty and customer advocacy
• Increased Marketing Effectiveness leading to higher ROI on every £ spent
• Cross selling and predictive promotional propensity means greater number of manufacturer partnerships
Why Do It?
• Currently, retailers, CPG firms and other manufacturers create shopper profiles based on historical data, SKU-level data and basket data; however, they struggle to marry that data with behavioral data from multiple other channels (mobile, social media, etc.) to map out the DNA of the customer, and they fail to predict future buying patterns of customers across categories and products
• A Single View of the Customer not only allows organizations to create targeted campaigns based on shopping patterns but also opens up new revenue streams through advanced marketing efforts such as cross-device marketing, beacon sensing, proximity marketing, etc.
Idea Summary
The Single View of Customer combines historical sales data from structured systems with new, unstructured and semi-structured data from social media, sentiment analysis, web activity, and blog posts. The Single View of the Customer helps create the DNA of the consumer, which can be used to target, re-target, and personalize messaging to help address issues around loyalty, churn, cross-selling, and growing the top line.
Innovate – Grow & Enable
19.
Contents
• Big Data in Retail
– Digital Revolution
– Explosion of Data
• Big Data Maturity Analysis
– Hortonworks Big Data Maturity Scorecard
– Retail and CPG Maturity Analysis
• Big Data Use Cases
– Retail Use Case Maturity Map
– Single View of Customer
• Big Data in Action
– Retail Case Study
– Call to Action
20. Quick Facts
Situation Analysis
• For direct marketing, the lack of visibility into a customer's credit and financial situation restricted the retailer's ability to pre-screen the "right" customers to send the mailers
• A mismatch between the Inventory Merchandising Ad Planner and warehouse inventory led to incomplete sales
• Generation of various business reports took days to complete, and even after that, not all the information was available to the business stakeholders
Innovation Strategy
• The retailer built an Enterprise Analytics platform based on Hortonworks Data Platform, breaking down silos and increasing the historical depth of data available for analysis
• Drove a targeted marketing strategy with insight-driven customer segmentation analysis, leveraging new data sources, including the available history
• Implemented near-real-time simulation of the new credit strategy with respect to approval or decline of the application process by collecting the exhaustive set of variables needed for credit policy coding for all customers
Business Impact
• Reduced spend on direct mailers by optimizing mailing by customer segment: $3M in the first 10 months of 2016 ($4.5M to $5.0M expected run-rate savings)
• Reduced ads-effectiveness analysis in the Product Performance report: 300x improvement in turnaround
• Reduced associate time in coding for red flags and lookups for decline rules: 500x time reduction in implementing credit policy
$3M: marketing dollars saved to date from trimming the direct mailers
Up to 500x: time improvement in implementing credit policy
Up to 45x: time improvement in generating the Inventory Merchandising Ad Planner
The digital revolution is transforming industries. Data strategy is a key part of this transformation, and two-thirds of firms are working towards improving it.
In retail, Big Data holds big potential: a potential 60% improvement in operating margins and a 15-20% improvement in marketing. This is driven by big transformational changes in retail – mobile devices and IoT sensors that measure customer movements in store. Cross-channel is also becoming important, with customers shopping across stores and online.
Retail firms are in the Exploration stage: most don't yet have a full-fledged Big Data vision and strategy, and they primarily use structured data. But they are actively working on it – across the board, average maturity scores will improve and most firms will move into the Optimizing stage. This will be driven by an enterprise-wide vision and strategy, the use of unstructured data, and leveraging a mix of in-house and outsourced skill sets.
Most of the firms are currently in the Exploration stage
Firms lack an enterprise vision around Big Data, and funding is unbudgeted
Although most of the firms still use structured data, some have started to collect unstructured data as well
Firms have also started to adopt analytical tools for project-specific objectives
Big Data skills are mainly located among technologists, and most of the work is outsourced
There is a lack of a formal process for planning Big Data programs
In 2-3 years, firms are planning to attain the Optimizing stage
Firms plan to attain an enterprise-wide vision and alignment, with sponsorship and funding
Firms expect to make big strides in storing their data through a data lake
Firms will use tools that fit the purpose, with centralized administration of tools and integration among the tools
Organizations are investing to gain advanced analytical skills and will leverage a mix of in-house and outsourced skill sets
Planning and budgeting for Big Data will be part of the cyclical budgeting process
In vision and strategy, the majority of firms currently lack an enterprise-wide vision. Funding is unbudgeted and seems to come from IT projects, there is little executive sponsorship, and there is very little business-case development around Big Data.
In 2-3 years, most of the firms will have an enterprise-wide vision and strategy for Big Data. Funding will be part of the cyclical budgeting process. There will be increased alignment among executive sponsors to support Big Data, with Big Data business cases being developed.
In the Data and Analytics domain, most of the firms are still using structured data while discarding most of the data they collect. The firms are also focused mainly on measuring key business metrics rather than doing advanced analytics.
Over the next 2-3 years, firms plan to leverage unstructured data and store it in data lakes, keeping the data even if it isn't in use at that time. Firms are planning to perform advanced and predictive analytics on top of this data lake.
Currently, firms store their data on-premises in traditional EDWs. They are starting to adopt analytical tools for specific objectives but aren't able to conduct cross-functional analysis, as there is little integration of tools across the organization.
With respect to technology, the firms are mainly moving toward a hybrid hosting strategy of on-premises and cloud-based storage and analysis. As discussed, firms are planning to build a data lake based on multiple Hadoop clusters, with tools on top of it that are integrated and able to provide cross-functional insights.
In terms of organization and skills, whatever Big Data skills firms have are currently located in the IT organization. Firms outsource quite a bit of the work for Big Data projects. Firms also don't yet have a CoE to spread best practices across the whole organization and to achieve cross-group collaboration.
But firms expect things to be different in 2-3 years. Most of the firms are investing in gaining advanced analytical skills and expect a mix of in-house and outsourced Big Data skill sets within the organization. The majority of firms are also planning to have a centralized CoE group for cross-functional collaboration and for institutionalizing best practices.
Within process management, firms currently lack planning around Big Data programs, as projects seem to be driven within IT, by IT budgets. Given this, there is hardly any evaluation of results from Big Data projects.
In the future, the majority of firms are planning to have budgeting at either the business-unit level or enterprise-wide. This will result in businesses looking for Big Data programs that drive new value streams and business models. With this will come more effective measurement of Big Data projects and their outcomes.
Firms have a wide spectrum of capabilities when it comes to data security. Some have basic security and governance processes, while others have enterprise-wide standards in these areas. These capabilities will only improve over the next 2-3 years.
We analyzed the maturity scorecard data for one of the top European retailers. The retailer already has an enterprise-wide vision and strategy, which drives the rest of the organization. The firm ingests and analyzes unstructured data in Hadoop to perform advanced and predictive analytics. It is investing in Big Data skills and has already included Big Data programs in its budgeting and planning cycle.
The firm is already in the Optimizing stage and will reach the Transforming stage in 2-3 years.