This document discusses opportunities for using big data in private wealth management. It begins by defining big data and describing how data volumes have increased exponentially. It then outlines several potential use cases for big data in areas like real-time performance metrics, portfolio optimization, and leveraging customer data. For each use case, it describes current limitations and how a big data approach could enable new capabilities. Finally, it proposes a phased approach for wealth managers to identify use cases, prioritize them, implement proofs of concept, and incrementally automate analysis and reporting. The overall message is that big data can enhance analytics and open up new opportunities previously only available to investment banks.
Artificial Intelligence for Banking Fraud Prevention - Jérôme Kehrli
Artificial Intelligence at NetGuardians:
"From skepticism to large scale adoption towards fraud prevention"
Slides of my speech at the EPFL / EMBA Innovation Leader 2018 event.
Artificial Intelligence and Digital Banking - What about fraud prevention? - Jérôme Kehrli
Artificial intelligence for banking fraud prevention.
A presentation on how fraud prevention is rooted in banking digitalisation and how it impacts customer experience.
Analytics in banking preview deck - June 2013 - Everest Group
This report provides a comprehensive understanding of the analytics services industry with a focus on the banking domain. Analytics adoption in the banking industry is covered in depth, exploring aspects such as market size, key drivers, recent analytics initiatives, and challenges. The report also analyses trends in analytics deals for various banking subverticals (cards, retail, commercial, and lending) and evaluates the analytics capabilities of 20+ service providers in the banking space.
Fintech workshop Part I - Law Society of Hong Kong - Xccelerate - Henrique Centieiro
What is fintech? Which technologies underpin fintech? How are AI, blockchain, cloud and data analytics changing the financial world?
Henrique works as an Innovation Project Manager implementing fintech and blockchain projects for the financial industry.
Find me here: linkedin.com/in/henriquecentieiro
In this presentation Juan M. Huerta talks about the big data adoption process at Citi and realising the technical value of big data and global solutions. Huerta goes on to discuss following a hybrid approach and the future of analytics: expensive algorithms applied to large datasets. Citi is using these approaches in hopes of gaining even wider global recognition.
IBM Solutions Connect 2013 - Getting started with Big Data - IBM Software India
You've heard of Big Data for sure. But what are the implications for your organisation? Can your organisation leverage Big Data too? If you decide to go ahead with your Big Data implementation, where do you start? If these questions sound familiar, then you've stumbled upon the right presentation. Go through the presentation to:
a. Learn more about Big Data
b. See how Big Data can help you outperform in your marketplace
c. Learn how to proactively manage security and risk
d. Learn how to create IT agility to underpin the business
Also, learn about IBM's superior Big Data technologies and how they are helping today's organisations take smarter decisions and actions.
Welcome to the Age of Big Data in Banking - Andy Hirst
Big Data in banking presentation from Sibos Dubai 2013. What use cases are driving deployments in banking? See the banking use cases SAP was involved in during 2013.
Presented by Reto Cavegn at the 4th meeting: We would like to present IBM's view on Big Data, what the market requires, and what products and strategies have evolved out of these requirements. Further, we will present some reference projects to show what use cases customers are working on today and what challenges our customers are trying to solve with Big Data. We will round up with some challenges and lessons we have learned.
Big Data Banking: Customer vs. Accounting - Henry Sampson
Core Banking Systems have evolved from treating customer data as peripheral to transactions to making it more and more a central focus of the system. This presentation explores how DreamOval is positioning Bank Nurse to meet this new reality of storing more customer data than transactions.
Building Confidence in Big Data - IBM Smarter Business 2013 - IBM Sverige
Success with big data comes down to confidence. Without confidence in the underlying data, decision makers may not trust and act on analytic insight. You need confidence in your data – that it’s correct, trusted, and protected through automated integration, visual context, and agile governance. You need confidence in your ability to accelerate time to value, with fast deployments of big data appliances. Learn how clients have succeeded with big data by building confidence in their data, ability to deploy, and skills. Presenter: David Corrigan, Big Data specialist, IBM. More from the day at http://bit.ly/sb13se
Big Data & Analytics for Banking, New York - Lars Hamberg
BIG DATA & ANALYTICS FOR BANKING SUMMIT, New York, 1 Dec 2015.
Keynote address: "How Predictive Analytics will change the Financial Services Sector”
Speaker : Lars Hamberg
http://www.specialistspeakers.com/?p=8367
Overview & Outlook: Why Big Data will over-deliver on its hype and transform Financial Services; use cases with Advanced Analytics and Big Data Analytics in Financial Services, in production & distribution of banking products; new opportunities for incumbents in tomorrow’s ecosystem.
[AI in finance] AI in regulatory compliance, risk management, and auditing - Natalino Busa
AI to Improve Regulatory Compliance, Governance & Auditing. How AI identifies and prevents risks, above and beyond traditional methods. Techniques and analytics that protect customers and firms from cyber-attacks and fraud. Using AI to quickly and efficiently provide evidence for auditing requests.
Wondering how to bring services to your clients in real time – and on their preferred device? Need to automate your financial supply chain, including risk and compliance functions, and move to a pay for performance model?
Learn about use cases from within the big data ecosystem, ranging from AML compliance and the trade lifecycle to fraud detection and digital transformation, with an introduction to their risk data aggregation and compliance initiative. Find out how you can best leverage Open Enterprise Hadoop to achieve these goals.
Big Data Monetisation
PSD were pleased to host a breakfast at the Royal Horseguards Hotel discussing what companies can do to monetise their data, and to bring the debate to the CEO's office.
Leading the discussion, and presenting his portfolio of work in this area was Mike Fishwick.
Mike has recently led the Business Insights programme at Telefonica Digital, and has an almost unique viewpoint on the application of data science in this area.
Attending were technology leaders from a broad range of sectors, all of whom are investigating what to do with the ever-increasing torrent of data they manage.
For the latest IT & Business Change jobs & salary survey information, go to:
http://www.psdgroup.com/information_technology.aspx
Chris Eldridge - MD
CGI's Steve Starace, SVP & BU Leader, U.S. Northeast explains how CGI’s solutions and services are addressing clients’ top priorities in the banking industry.
The high volumes of data held in modern data stores can help organizations of all types learn about the needs and trends of the current market. Modern information technology that analyzes the relationship between social trends and market insights offers an indirect link to customers and their interests through unstructured and semi-structured data. Such analysis gives organizations a broader view of customers' practical needs, and once the banking industry, or any industry, knows its customers, it can serve them better and with more flexibility. In this presentation, the team has created a platform and designed an architecture on big data technology for the banking industry to maximize credit card usage.
Accelerating Data-Driven Enterprise Transformation in Banking, Financial Serv... - Denodo
Watch full webinar here: https://bit.ly/3c6v8K7
Banking, Financial Services and Insurance (BFSI) organizations are globally accelerating their digital journey, making rapid strides with their digitization efforts, and adding key capabilities to adapt and innovate in the new normal.
Many companies find digital transformation challenging as they rely on established systems that are often not only poorly integrated but also highly resistant to modernization without downtime. Hear how the BFSI industry is leveraging data virtualization that facilitates digital transformation via a modern data integration/data delivery approach to gain greater agility, flexibility, and efficiency.
In this session from Denodo, you will learn:
- Industry key trends and challenges driving the digital transformation mandate and platform modernization initiatives
- Key concepts of Data Virtualization, and how it can enable BFSI customers to develop critical capabilities for real-time / near real-time data integration
- Success stories from organizations that already use data virtualization to differentiate themselves from the competition
Very many IT experts know of Big Data, or at least have some notion of it. In practice, however, only a few in Germany currently work with it. Yet Big Data brings a whole new momentum to modern software solutions and is indispensable in the context of the mobile, cloud and social shifts. Big Data makes software intelligent and thus lets users experience it in an entirely new way. Big Data gives rise to new software architectures, because information is processed completely differently: faster, in a more differentiated way, and often with the goal of drawing conclusions and making predictions.
This talk explains how modern software architectures are designed so that you can successfully implement Big Data paradigms, and what advantages arise for increasingly mobile software solutions. We also take a look at the potential and options in industries such as banking, insurance and retail.
Watch full webinar here: https://bit.ly/2Y0vudM
What is Data Virtualization and why do I care? In this webinar we intend to help you understand not only what Data Virtualization is but why it's a critical component of any organization's data fabric and how it fits. Data virtualization liberates and empowers your business users via data discovery and data wrangling, through to the generation of reusable reporting objects and data services. Digital transformation demands that we empower all consumers of data within the organization, and it demands agility too. Data Virtualization gives you meaningful access to information that can be shared by a myriad of consumers.
Register to attend this session to learn:
- What is Data Virtualization?
- Why do I need Data Virtualization in my organization?
- How do I implement Data Virtualization in my enterprise?
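As a toy illustration of the idea these sessions introduce (one query layer over heterogeneous sources, with nothing copied until query time), here is a minimal Python sketch. The `VirtualLayer` class and its methods are hypothetical names for illustration, not Denodo's API.

```python
# Toy data-virtualization layer: one query interface over several sources,
# with data pulled lazily at query time rather than copied up front.
# Hypothetical names -- not Denodo's API.

class VirtualLayer:
    def __init__(self):
        self.sources = {}  # source name -> zero-argument fetch function

    def register(self, name, fetch):
        # Nothing is materialized here; we only remember how to fetch.
        self.sources[name] = fetch

    def query(self, name, predicate=lambda row: True):
        # The underlying source is read only now, at query time.
        return [row for row in self.sources[name]() if predicate(row)]

# An in-memory "CRM table" standing in for a real source system.
crm = [{"id": 1, "segment": "retail"}, {"id": 2, "segment": "private"}]

layer = VirtualLayer()
layer.register("crm", lambda: crm)
print(layer.query("crm", lambda r: r["segment"] == "retail"))
# → [{'id': 1, 'segment': 'retail'}]
```

Real data-virtualization platforms add federation, caching and governance on top, but the lazy, no-copy access pattern is the essential point.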
In this presentation we review the basic architecture behind SQL Server StreamInsight.
Regards,
Ing. Eduardo Castro Martínez, PhD – Microsoft SQL Server MVP
http://mswindowscr.org
http://comunidadwindows.org
Costa Rica
This presentation covers Big Data Analytics in detail, explaining its three key characteristics, why and where it can be used, how it is evaluated, what kinds of tools we use to store data, and how it has impacted the IT industry, along with some applications and risk factors.
Strategizing Big Data in Telco
Big data is a very hot topic nowadays. Some industries depend on it completely, some have opportunities to roll out their strategies and execute, and some are just considering when the right time is to hop in.
To my mind, Big Data is not about technology. Big data is about people generating data and data used for the benefit of people.
Big data is a pool of activities aimed at processing the data a company owns (internal and external) so as to open new revenue opportunities, minimize costs and enhance UX.
I had some ideas and thoughts on where telecommunication companies might start in formulating a Big Data strategy, and so packed some of the most important ones into a small presentation.
What is the difference between Small Data and Big Data?
What kind of data is used currently, and which will the new paradigm rely on?
What kind of products are expected from telcos?
My personal ranking of operators in terms of their Big Data execution
What are the stages telcos should pass through to become a Big Data operator?
Prerequisites for Big Data transformation
Please take a look at the presentation to find answers to these questions and feel free to share your opinion.
Thanks!
Cloud & Big Data - Digital Transformation in Banking - Sutedjo Tjahjadi
Datacomm Cloud Business Overview
Making Indonesia 4.0
Digital Transformation in Banking Industry
Introduction to Cloud Computing
Big Data Analytics Introduction
Big Data Analytics Application in Banking
Introduction to Modern Data Virtualization 2021 (APAC) - Denodo
Watch full webinar here: https://bit.ly/2XXyc3R
“Through 2022, 60% of all organisations will implement data virtualization as one key delivery style in their data integration architecture," according to Gartner. What is data virtualization, and why is its adoption growing so quickly? Modern data virtualization accelerates time to insights and data services without copying or moving data.
Watch on-demand this webinar to learn:
- Why organizations across the world are adopting data virtualization
- What is modern data virtualization
- How data virtualization works and how it compares to alternative approaches to data integration and management
- How modern data virtualization can significantly increase agility while reducing costs
Introduction to Modern Software Architecture - Jérôme Kehrli
This talk offers an introduction to software architecture with a modern perspective. We will consider a new way to identify architectural elements and walk through some examples of modern architectures, the NoSQL world, Big Data architectures and micro-services.
A proposed framework for Agile Roadmap Design and Maintenance - Jérôme Kehrli
Maintaining a relevant and meaningful roadmap while adopting a state-of-the-art Agile methodology is challenging, and the two goals are somewhat at odds.
This presentation proposes a framework for designing and maintaining an Agile Roadmap.
A presentation of the search for Product-Market Fit with the principles, practices and processes that lead to it, from the Lean-Startup and Design Thinking perspective
From Product Vision to Story Map - Lean / Agile Product shaping - Jérôme Kehrli
A lot of software engineering projects fail for lack of a shared vision, due to poor communication among the people involved in the project.
Sound maintenance of the product backlog can only be achieved if everyone has a good understanding of what they have to do (a common vision).
Roman Pichler, in a post originally written on 16 Jul 2012, proposed a really interesting approach: use various canvases to create and share the product vision and to support product backlog creation and refinement.
This presentation is a drive through the various boards and canvases that should be designed prior to any product development: the Product Vision, the Lean Canvas, the Product Definition and the Story Map.
Introduction to NetGuardians' Big Data Software Stack - Jérôme Kehrli
NetGuardians executes its Big Data Analytics Platform on three key Big Data components: ElasticSearch, Apache Mesos and Apache Spark. This is a presentation of the behaviour of this software stack.
Periodic Table of Agile Principles and Practices - Jérôme Kehrli
Recently I stumbled by chance upon the Periodic Table of the Elements... Long time no see... Remembering my physics lessons at university, I always loved that table. I remember spending hours understanding the layout and admiring the beauty of its natural simplicity.
So I had the idea of trying the same layout (not the same approach, since the two are not comparable; really only the same layout) for Agile Principles and Practices.
The result is in this presentation: The Periodic Table of Agile Principles and Practices:
Agility and planning: tools and processes - Jérôme Kehrli
In this presentation, I intend to present the fundamentals, the roles, the processes, the rituals and the values that I believe a team would need to embrace to achieve success down the line in Agile Software Development Management - Product Management, Team Management and Project Management - with the ultimate goal of making planning and forecasting as simple and efficient as it can be.
Bytecode manipulation with Javassist for fun and profit - Jérôme Kehrli
Java bytecode is the form of instructions that the JVM executes.
A Java programmer, normally, does not need to be aware of how Java bytecode works.
Understanding the bytecode, however, is essential in the areas of tooling and program analysis, where applications modify the bytecode to adjust behavior to the application's domain. Profilers, mocking tools, AOP, ORM frameworks, IoC containers, boilerplate code generators, etc. require a thorough understanding of Java bytecode and a means of manipulating it at runtime.
Each and every one of these advanced features, nowadays standard approaches when programming in Java, requires a sound understanding of Java bytecode, not to mention the completely new languages running on the JVM such as Scala or Clojure.
Bytecode manipulation is not easy though ... except with Javassist.
Of all the libraries and tools providing advanced bytecode manipulation features, Javassist is the easiest to use and the quickest to master. It takes any initiated Java developer only a few minutes to understand Javassist and be able to use it efficiently. And mastering bytecode manipulation opens up a whole new world of approaches and possibilities.
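Javassist itself is Java-specific, but the flavor of what such tools do (injecting behavior into an existing method at runtime, as a profiler or AOP framework would) can be sketched in plain Python. The names below (`instrument`, `call_counts`, `transfer`) are purely illustrative, not Javassist's API.

```python
# Javassist rewrites Java bytecode; the analogous effect in Python --
# running extra code before an existing function body -- can be mimicked
# by rebinding the function name to an instrumented wrapper at runtime.
# Illustrative names only.

import functools

call_counts = {}

def instrument(func):
    # Rough analogue of prepending code to a method body.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        call_counts[func.__name__] = call_counts.get(func.__name__, 0) + 1
        return func(*args, **kwargs)
    return wrapper

def transfer(amount):
    return amount

# Swap the existing definition for the instrumented one, as a profiler would.
transfer = instrument(transfer)

transfer(100)
transfer(50)
print(call_counts)  # → {'transfer': 2}
```

In Java, Javassist achieves the same effect without wrappers by editing the compiled method's bytecode directly, which is what makes it suitable for tooling that must not change application source code.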
DevOps is a methodology capturing the practices adopted from the very start by the web giants who had a unique opportunity as well as a strong requirement to invent new ways of working due to the very nature of their business: the need to evolve their systems at an unprecedented pace as well as extend them and their business sometimes on a daily basis.
While DevOps obviously makes critical sense for startups, I believe that big corporations with large and old-fashioned IT departments are actually the ones that can benefit the most from adopting these principles and practices.
Digitalization: A Challenge and An Opportunity for Banks - Jérôme Kehrli
Today’s banking industry is strongly defined by one word: digital. The urgency to act grows more severe each day. Banks using digital technologies to automate processes, improve regulatory compliance, and transform the customer experience may realize a profit upside of 40% or more, while laggards that resist digital innovation will be punished by customers, financial markets and regulators, and may see up to 35% of net profit eroded, according to a McKinsey analysis.
The vital question to answer is, do we get digitalization right? Why is it getting extremely urgent to digitize?
Some years ago, Eric Ries, Steve Blank and others initiated The Lean Startup movement. The Lean Startup is a movement, an inspiration, a set of principles and practices that any entrepreneur initiating a startup would be well advised to follow.
Projecting myself into it, I think that if I had read Ries' book before, or even better Blank's book, I would maybe own my own company today, around AirXCell or another product, instead of being disgusted and honestly not considering it for the near future.
In addition to giving a pretty important set of principles when it comes to creating and running a startup, The Lean Startup also implies an extended set of Engineering practices, especially software engineering practices.
Smart Contracts are a central component to next-generation blockchain platforms. Blockchain technology is much broader than just bitcoin. The sustained levels of robust security achieved by public cryptocurrencies have demonstrated to the world that this new wave of blockchain technologies can provide efficiencies and intangible technological benefits very similar to what the internet has done.
Blockchains are a very powerful technology, capable of going much further than "simple" financial transactions; a technology capable of performing complex operations and understanding much more than just how many bitcoins one currently has in one's digital wallet.
This is where the idea of Smart Contracts come in. Smart Contracts are in the process of becoming a cornerstone for enterprise blockchain applications and will likely become one of the pillars of blockchain technology.
In this presentation, we will explore what a smart contract is, how it works, and how it is being used.
The Blockchain - The Technology behind Bitcoin Jérôme Kehrli
The blockchain and blockchain related topics are becoming increasingly discussed and studied nowadays. There is not one single day where I don't hear about it, that being on linkedin or elsewhere.
I interested myself deeply in the blockchain topic recently and this is the first article of a coming whole serie around the blockchain.
This presentation is an introduction to the blockchain, presents what it is in the light of its initial deployment in the Bitcoin project as well as all technical details and architecture concerns behind it.
We won't focus here on business applications aside from what is required to present the blockchain purpose, more concrete business applications and evolutions will be the topic of another presentation I'll post in a few weeks
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
A tale of scale & speed: How the US Navy is enabling software delivery from l...sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
GraphRAG is All You need? LLM & Knowledge GraphGuy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor...SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
UiPath Test Automation using UiPath Test Suite series, part 5DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 5. In this session, we will cover CI/CD with devops.
Topics covered:
CI/CD with in UiPath
End-to-end overview of CI/CD pipeline with Azure devops
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdfPeter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Removing Uninteresting Bytes in Software FuzzingAftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing xml documents, and Binutil's readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format). Our preliminary results show that AFL+DIAR does not only discover new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
Climate impact / sustainability of software testing discussed on the talk. ICT and testing must carry their part of global responsibility to help with the climat warming. We can minimize the carbon footprint but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be added with sustainability, and then measured continuously. Test environments can be used less, and in smaller scale and on demand. Test techniques can be used in optimizing or minimizing number of tests. Test automation can be used to speed up testing.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio, cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors, and newer malware including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Agenda

1. What is Big Data?
2. Opportunities in Wealth Management
   1. Real-time performance and risk metrics
   2. Portfolio optimization and simulation
   3. Leveraging customer data
3. How to get there?
Data deluge!

5 exabytes of data (5 billion gigabytes) were generated from the first measurements until 2003. In 2011, that quantity was generated in 2 days. In 2018, it was generated in 2 minutes.

Source: https://www.emc.com/collateral/analyst-reports/idc-the-digital-universe-in-2020.pdf
Technical capability evolution

For the last 40 years, the capabilities of IT components have grown exponentially: CPU, RAM, disk, etc. all follow Moore's law.

Source: http://radar.oreilly.com/2011/08/building-data-startups.html
Storage cost evolution

Unit capacity keeps increasing while unit cost keeps decreasing, so vertical scalability is always the easiest solution... really?

[Chart: cost per gigabyte of hard drives and RAM, 1975-2015, on a log scale from 10,000,000 $ down to 0.01 $; callouts mark 5 M$/GB and 5 $/GB. Source: http://www.mkomo.com/cost-per-gigabyte]
Disk throughput evolution

Issue: throughput always evolves more slowly than capacity. Gains shown: x100,000 overall; capacity: x10,000 in 10 years; throughput: x50 in 10 years. How do we read and write more and more data through a pipe that is, relatively speaking, ever narrower?
New architectures and paradigms

How do we process data that no longer fits in memory (RAM)? How do we read thousands of files of billions of rows in limited time (I/O)? How do we process trillions of operations in limited time (CPU)?

The Web giants were the first to face the limits of traditional architectures. These massive computation needs are the reason we need new architectures and paradigms such as Hadoop or No/New SQL.

Idea #1: Run transactions and computations in parallel
Idea #2: Scale the grid of CPU, DRAM and disk at the multi-datacenter level
Idea #3: Move the code to the computing node, not the data (a tier-layer revolution)
So what is big data?

Defining Big Data actually goes beyond the formal definition. It is all at once a technology evolution anticipated by the big consulting companies and a business opportunity:

- More and different data
- Evolution of data science
- More computing capacity
- New technologies and architectures
Big Data: definition

"Big data represents the information assets characterized by such a high volume, velocity and variety to require specific technology and analytical methods for its transformation into value."

https://en.wikipedia.org/wiki/Big_data#Definition
Existing Architectures Limits

- IO Limit: applications that are heavily storage oriented
- CPU Limit: applications that are heavily computation oriented
- TPS Limit (Transactions Per Second): applications that are heavily high-throughput oriented - transactional applications / operational information systems
- EPS Limit (Events Per Second): applications that are heavily ultra-low-latency oriented - event flow applications
Orders of magnitude at which "classical" architectures require huge software and hardware adaptations:

- IO Limit: over 10 TB of data
- TPS Limit: over 1,000 transactions per second
- EPS Limit: over 1,000 events per second
- CPU Limit: over 10 threads per core, sequential programming reaches its limits (IO)
Traditional and Specialized Architectures

Beyond traditional architectures (RDBMS, application servers, ETLs, ESBs, etc.), each limit has a specialized answer:

- IO Limit: storage grids - distributed storage / share-nothing grids
- CPU Limit: computation grids - parallel processing
- TPS Limit: transaction grids - XTP
- EPS Limit: stream grids - event stream processing
New Architectures

Alongside traditional architectures (RDBMS, application servers, ETLs, ESBs, etc.), new families of technologies answer these limits:

- Hadoop and the usual Hadoop ecosystem
- NoSQL / NewSQL
- In-memory analytics
- Other Big Data processing engines
- Streaming solutions
- CPU grids / GPU grids
Some examples

- Streaming: RabbitMQ, ZeroMQ, Apache Kafka, Esper, Spark / Spark Streaming
- Storage: HDFS, HBase, Cassandra, MongoDB, CouchDB, Voldemort, Redis, MapR, Exadata, EMC, Teradata
- In-memory / transactional: SQLFire, GigaSpaces, Hana, Quartet ActivePivot, Exalytics
- Hadoop ecosystem / processing: Hive, Pig, Hama, Igraph, Sqoop
Big Data in Wealth Management - state-of-the-art use cases

Investment Research
- Data discovery / market research
- Development of investment ideas
- Testing of investment strategies
- Portfolio management
- Trading

Risk Management
- Aggregation of position data
- Position monitoring
- Risk dashboards
- Portfolio management

Customer Knowledge
- Unified / consolidated customer view
- Customer profiling and analysis
- Know Your Customer / external data
- Analysis of unstructured data
- Client knowledge
- Investment advisory

Compliance and Monitoring
- Pre / post-trade
- Fraud detection and prevention
- Anti-Money Laundering
- Communication channels monitoring
Focus on some use cases

1. Real-time performance and risk metrics
- Intra-day positions and trades
- Risk dashboards
- Solvency ratios for credit approval
- Fraud prevention / Anti-Money Laundering

2. Portfolio optimization and simulation
- What if we had taken this or that investment decision?
- What about this investment strategy?
- Large-scale portfolio optimization

3. Leveraging customer data
- Customer profiling / classification
- Personalized investment advice
- Better / deeper fraud detection
- Marketing campaigns
Portfolio metrics

Portfolio metrics: performance, contributions / exposure, and risk metrics (variance, Sharpe ratio, volatility, Value at Risk), broken down by region / country / currency / ... and by sector / industry / ...

Global metrics: the same as portfolio metrics, broken down by office, line of business, contract, customer, etc., and also by country, currency, sector, industry, etc.

Portfolio metrics in private banking institutions today: valuation / performance are computed in nightly batches, portfolio risk metrics in weekend batches, global metrics in quarterly batches.
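At their core, all the breakdowns above are a grouping of position values along one attribute. A minimal sketch, not from the deck; the positions, attribute names and figures are invented for illustration:

```python
from collections import defaultdict

def contributions(positions, by):
    """Weight contribution of a portfolio along one dimension
    (region, sector, currency, ...), as a fraction of total value."""
    total = sum(p["value"] for p in positions)
    out = defaultdict(float)
    for p in positions:
        out[p[by]] += p["value"] / total
    return dict(out)

positions = [
    {"value": 600.0, "region": "EU", "sector": "Tech"},
    {"value": 250.0, "region": "US", "sector": "Tech"},
    {"value": 150.0, "region": "US", "sector": "Energy"},
]
contributions(positions, "region")   # approximately {'EU': 0.6, 'US': 0.4}
contributions(positions, "sector")   # approximately {'Tech': 0.85, 'Energy': 0.15}
```

In a batch system this grouping runs over the whole book overnight; the same logic, expressed as a map-reduce or SQL GROUP BY, is what the distributed stacks below parallelize.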
Make it real-time...

The disastrous global financial crisis put a spotlight on the need for rapid feedback after market events. Banks are trying to obtain an array of risk metrics closer to real time, refreshed multiple times during the day:

- Reduce time to result as much as possible
- Take intraday positions and trades into account
- Get immediate feedback on intraday market events
- Use the latest quotes and other metrics
The good ol' way

1. Night / weekend / quarterly computation batches: missing intraday positions and live quotes, far from real time.

2. Intraday calculation within the operational IS: everything in the RDBMS, load / reload / compute / re-compute again; heavy load on the operational IS, slow computation, very long response times (several minutes), crashes...

3. Efficient off-operational-IS computation (rare...): distributed caches (JBoss Cache) with long reload times and huge operating costs, or computing grids (Teradata, ...) with huge licensing costs.
The Big Data Way

We have the technology:

- Commodity hardware: reduce TCO, scale out
- Open-source software stack: no licensing cost, ease of operation (initiated by the Web giants)
- From pull to push: computing portfolio or global performance or VaR on demand in real time is difficult; processing market events and updating the metrics incrementally in real time is straightforward.
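The pull-to-push idea can be sketched in a few lines: rather than revaluing the whole book on request, each market event updates a running value in O(1). A minimal illustration, not from the deck; instrument names and quantities are invented:

```python
class LivePortfolioValue:
    """Push-model valuation: instead of recomputing the whole portfolio
    on demand (pull), apply each incoming quote event incrementally."""

    def __init__(self, positions):
        # positions: {instrument: quantity held}
        self.positions = dict(positions)
        self.prices = {}     # last known price per instrument
        self.value = 0.0     # running portfolio value

    def on_quote(self, instrument, price):
        """Update the running value from a single market event in O(1)."""
        if instrument not in self.positions:
            return self.value
        old = self.prices.get(instrument, 0.0)
        self.prices[instrument] = price
        self.value += self.positions[instrument] * (price - old)
        return self.value

pf = LivePortfolioValue({"AAPL": 100, "NESN": 50})
pf.on_quote("AAPL", 180.0)   # value becomes 18000.0
pf.on_quote("NESN", 100.0)   # value becomes 23000.0
pf.on_quote("AAPL", 181.0)   # +100 * 1.0 -> 23100.0
```

In a real deployment this handler would sit behind a stream processor (Storm, Spark Streaming, ...) consuming the market feed; the point is that each event touches one position, not the whole book.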
Portfolio optimization and simulation

Portfolio optimization: Markowitz mean-variance (MPT), mean-CVaR, custom investment constraints, etc.

Portfolio simulation: What if we had taken this or that investment decision? What about this investment strategy? What if we try such a policy or regulation approach?

Most PMS software supports simulation and some optimization models out of the box. What about yours? Do you use these features? What about large-scale portfolio optimization?
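As a toy illustration of the Markowitz mean-variance model, the global minimum-variance portfolio even has a closed form, w proportional to the inverse covariance matrix times a vector of ones. This is a sketch, not the deck's implementation, and it ignores the custom investment constraints mentioned above (including no-short-selling), which would need a real optimizer:

```python
import numpy as np

def min_variance_weights(cov):
    """Global minimum-variance Markowitz portfolio (unconstrained):
    solve cov . w = 1, then normalize so the weights sum to one."""
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

cov = np.array([[0.04, 0.006],
                [0.006, 0.09]])
w = min_variance_weights(cov)   # the lower-variance asset gets the larger weight
```

Scaling this from one portfolio to a bank's whole book of portfolios and scenarios is exactly the "large-scale" gap the slide points at.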
Portfolio optimization - backtesting and stress testing

Portfolio backtesting / stress testing:
- Backtest over past periods (this is Markowitz)
- Test optimization parameters
- Stress-test over market events
- A lot of computations...

Stress testing and backtesting are a little less common... What do you have? What about large-scale backtesting?
Even further: compute the portfolio efficient frontier

Efficient frontier calculation:
- Sharpe ratio
- Heuristic computation (lots of computations)
- With respect to constraints, including dynamic constraints
- Compute the resulting weights

Very rare in most financial companies. What do you have? What about large-scale Sharpe ratio optimization?
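A hedged sketch of the heuristic computation hinted at above: sample many random long-only portfolios, score each by Sharpe ratio, and keep the best; the upper-left envelope of the sampled (volatility, return) pairs approximates the frontier. All numbers are invented; the point is that the workload is embarrassingly parallel:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_frontier(mu, cov, n=5000, rf=0.0):
    """Monte-Carlo frontier sweep: draw n random long-only weight
    vectors, compute return, volatility and Sharpe ratio for each,
    and return the max-Sharpe (tangency) candidate."""
    k = len(mu)
    w = rng.dirichlet(np.ones(k), size=n)          # rows sum to 1, >= 0
    rets = w @ mu
    vols = np.sqrt(np.einsum("ij,jk,ik->i", w, cov, w))
    sharpe = (rets - rf) / vols
    best = np.argmax(sharpe)
    return w[best], rets[best], vols[best], sharpe[best]

mu = np.array([0.05, 0.10, 0.07])      # toy expected returns
cov = np.diag([0.02, 0.08, 0.04])      # toy covariance matrix
w_best, ret, vol, sr = sample_frontier(mu, cov)
```

Each draw is independent of the others, which is why "lots of computations" maps so naturally onto a computing grid: shard the draws, keep the per-shard best, reduce.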
The good ol' way

1. Excel (quantitative analysts): extracts from the operational IS are loaded in Excel, and simulations and rebalancings are run from Excel. Slow, ill-suited, buggy.

2. Calculation programs (quantitative analysts): extracts from the operational IS are loaded in Matlab, and simulations and rebalancings are run from Matlab. Steep learning curve, poor user interface.

3. Proprietary / specialized software (Reuters, Bloomberg, ...): has to be integrated with the operational IS. Tricky, risky, expensive, inflexible.
The Big Data Way

Findings: quantitative research tooling is sub-optimal in most private banking institutions. Some needs are more or less covered, yet we are far from a large-scale and systematic approach to portfolio optimization and backtesting.

- Increase computing capacity with extreme parallelization / distribution: large-scale distributed systems, move the computing code to the data
- Reduce TCO: open-source software stacks (Hadoop, Cassandra, Infinispan, Python SciKit, R, ...), commodity hardware
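The parallelization point can be illustrated with an invented parameter-sweep backtest: every parameter combination is an independent job, so the sweep distributes trivially, whether over local threads as in this sketch or over a Hadoop/Spark grid. The strategy, data and parameters here are all toy assumptions:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

# Toy daily returns; in reality these would come from HDFS / Cassandra.
RETURNS = [0.01, -0.02, 0.015, 0.005, -0.01, 0.02, -0.005, 0.01]

def backtest(params):
    """One independent backtest: a hypothetical momentum rule that goes
    long when the mean of the last `lookback` returns exceeds `threshold`."""
    lookback, threshold = params
    pnl = 0.0
    for t in range(lookback, len(RETURNS)):
        signal = sum(RETURNS[t - lookback:t]) / lookback
        if signal > threshold:
            pnl += RETURNS[t]
    return params, pnl

grid = list(product([2, 3, 4], [0.0, 0.005]))   # parameter grid to sweep
with ThreadPoolExecutor() as pool:
    results = dict(pool.map(backtest, grid))
best = max(results, key=results.get)
```

"Move the code to the data" means shipping `backtest` to the nodes holding the historical returns instead of shipping years of ticks to one machine.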
Leveraging Customer Data

Private banking institutions keep track of a lot of data:

- Customer profiles: customer personal data and their changes, investment profiles
- CRM activity: all customer and prospect contacts, documents / phone calls / emails / ...
- Transactions and activity: financial transactions / account activity, orders and confirmations
- Online activity: bank online systems / e-banking, sometimes even firewall traces
- External data: what if we add external data? Twitter / Facebook activity

This huge amount of data is kept but rarely used. Can we make something of it?
Customer data: new opportunities?

Opportunities can be examined from two perspectives: Big Data as a "cost killer" or enhancer, and Big Data as a way to "widen the field of possibilities".
What can we do with it anyway?

Fraud detection and AML
- Real-time transaction monitoring: identify patterns and outliers
- Communication channels monitoring: mining of emails, calls, logs, ...
- Web-site / firewall monitoring: match firewall and website breach attempts to account activity
- Better / deeper fraud detection
- Anti-Money Laundering

Customer analysis
- Customer segmentation: growth potential / risk metrics / ...
- Customer profiling: investment profiles, peer groups
- Prospection analysis
- Marketing campaigns: invest in the most promising customers
- Personalized investment advice
- Benchmark customers
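A minimal, illustrative take on "identify patterns and outliers" for transaction monitoring: compare each new amount against the customer's own history with a z-score. Real fraud/AML engines are far richer; every name and figure here is invented:

```python
from statistics import mean, stdev

def flag_outlier(amounts, new_amount, z_cut=3.0):
    """Flag a transaction whose amount deviates strongly from the
    customer's own history -- the simplest pattern/outlier check
    behind real-time transaction monitoring."""
    if len(amounts) < 2:
        return False               # not enough history to judge
    m, s = mean(amounts), stdev(amounts)
    if s == 0:
        return new_amount != m
    return abs(new_amount - m) / s > z_cut

history = [120, 80, 95, 110, 105, 90, 100, 85]
flag_outlier(history, 5000)   # unusually large transfer -> True
flag_outlier(history, 97)     # in line with history -> False
```

Per-customer baselines like this are cheap to maintain incrementally in a streaming system, which is what makes the "real-time" qualifier realistic.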
What can we do with it anyway? (cont'd)

Online banking software
- Advanced search on transactions: multi-criteria (date, name of shop, amount), full-text search
- Annotate transactions and shops, compare, get advice, etc.
- Long-term logs of all transactions
- User experience revolution, marketing / digital experience, user engineering

Know Your Customer
- Customer 360 view: everything with contacts, transactions, performance, risk metrics, ...
- Customer identification program: matching with the Web and social networks
- Fast access to all customer data, unified view of the customer, real-time construction and display
The Big Data Way

Manipulating, consolidating and mining all the data related to customer and/or user activity coming from heterogeneous sources is difficult. Some initiatives already exist in most institutions, but they are most of the time limited to small, specific data sets, or implemented on expensive technologies such as Teradata.

- Manipulate and consolidate very large volumes of data efficiently: HDFS and other No/NewSQL stores, highly distributed computing
- Reduce TCO: open-source software stacks (Hadoop, Cassandra, Infinispan, Python SciKit-Learn, R, ...), commodity hardware
Take Away

- Improve the analytics system; reduce TCO / cost killer!
- Extend the field of possibilities: consider use cases so far reserved to investment banking institutions - large-scale portfolio optimization, simulation, rebalancing, backtesting, etc., and real-time metrics
- Widen the field of possibilities: user experience revolution, marketing / digital experience, user engineering, customer 360 view, KYC, deeper depth of analysis
Architecture: Big Data deployment in private banking institutions

Data acquisition:
- Live market data: instruments, quotes, index values
- Historical market data: instruments, quotes, index values
- External data (PULL mode): Web APIs, Web searches, social networks
- Operational data: transactions, account activity, mails / calls, logs, PMS data (portfolios / positions, trades, accounts / structure, reference data)

Collection and analysis:
- Storm (real-time processing) and Infinispan (data grid) for live market data
- Cassandra (database) for historical market data
- Hadoop / HDFS / HBase for storage and analysis of operational, accounting, reference and unstructured data
- An expert system for portfolio simulation, optimization, rebalancing, backtesting and stress testing

Restitution:
- Real-time and batch metrics: portfolio performance, risk metrics, real-time dashboards
- Analysis: customer knowledge, fraud detection
- Tulip / Hive / Pig for querying, data visualization and results reporting
How to get there?

1. Identify your potential use cases
2. Prioritize use cases with business experts / representatives
3. Identify technological leads / opportunities
4. Implement proofs of concept

Then proceed in an incremental / iterative way:

1. Import data into the target technology (one-time extracts, then scheduled extractors)
2. Discover the data and the technology (Pig / Hive, data visualization, Storm, Infinispan)
3. Automate / analyze the data (new analysis, reporting, automation)
An evolving society

Yesterday - in 2008 - we were amazed by the first smartphones. Today they have almost become a part of ourselves; we cannot go without them anymore. We look at our smartphones 150 times a day. Is it the biggest invention of the decade? Likely, but of the previous decade, not the current one.

Today: always-connected, interconnected people. Tomorrow: the Internet of Things.

The Internet of Things (IoT) refers to uniquely identifiable objects, their interconnection over the internet, and their automatic exchange of information with third-party services.
Consumerization: new information technologies emerge first in the consumer market and then spread into businesses. This is a change from the previous situation: companies used to have better servers, desktops and applications than those employees could buy at home. Now new solutions emerge every month, and companies can't keep up.

A new trend: employees are hired with their own devices and applications. The BYOD trend: employees are more comfortable and more efficient with their own devices - there is as much power in an iPad today as in a Cray a few years back. This consumerization can be found in infrastructure too and is an enabler for the consumer market.

A direct consequence of consumerization is the mix of professional and personal tools used by employees (office suites, Gmail, Google+, Twitter, Facebook, Dropbox, Evernote, ...). Nowadays several companies - private banks among them - still block access to these tools for their employees. Tomorrow, that won't be possible anymore.
People are used to being connected all the time, with highly efficient devices and highly responsive services, everywhere and for all kinds of uses. Global sales of PCs never really exploded; global sales of smartphones and tablets, on the other hand, are exploding. Global mobile traffic went from 1% in 2009 to 4% in 2010 and 12% in 2012; today it reaches 30%. In India, the wired telecommunication infrastructure could never be developed as it was in Europe or the US; there, mobile traffic already exceeded desktop traffic in 2012. In 2015, over 3 billion people will be connected all the time, everywhere and for all kinds of uses.
But let's consider something else: the Internet of Things (IoT) - the coming big thing. Gartner forecasts 26 billion devices on the Internet of Things by 2020; ABI Research forecasts 30 billion devices wirelessly connected by 2020: cars, watches, fridges, cameras, whole houses, ...

Internet of Everything: "Cisco defines the Internet of Everything (IoE) as bringing together people, process, data, and things to make networked connections more relevant and valuable than ever before - turning information into actions that create new capabilities, richer experiences, and unprecedented economic..." The Internet of Everything is the coming evolution from the interconnection of people and objects, always, all the time, everywhere and for all kinds of uses.
From the time we started estimating and measuring the amount of data produced until 2003, 5 exabytes (5 billion gigabytes) were produced. In 2011, that quantity was generated in 2 days (think of Facebook, Twitter, Google search logs, financial transaction logs, etc.). In 2014, this quantity is generated in 10 minutes.

Not only do we generate more and more data; we also have the means and the technology to analyze, exploit and mine it, and to extract meaningful business insights. The data generated by a company's own systems can be a very interesting source of information regarding customer behaviours, profiles, trends, desires, etc. - but so can external data: Facebook, Twitter logs, etc.

A Twitter story: in the Uber car transportation system in Paris, a driver refused to carry a customer because the customer was gay. The customer tweeted his misadventure, and the driver was excluded by Uber only a few hours later. Instead of harming Uber's reputation, the story rather gave it credit - just an example of how a company can gain a significant advantage by monitoring social network feeds.
Data is produced absolutely everywhere! Satellites are an interesting example to underline this "everywhere" aspect, but think of:

- Facebook / Twitter / LinkedIn -> on the internet
- Financial markets and transactions -> in financial institutions and on market places
- Cash dispensers / payment card transactions -> everywhere in the world

Or even think of:

- Plane and train traffic -> electronically monitored, and the monitoring data is published
- Satellites are in space, but even underground data is produced: the New York, London and Paris subways are electronically monitored, and their data is published as well

Data is produced absolutely everywhere and all the time.
For a long time, the increasing volume of data to be handled has not been an issue
The volume of data rises, the number of user rises
The processing abilities rise as well, sometimes even more
See the Moore low above
This model has hold for a very long time.
The cost are going down, the computing capacities are rising, one simply needs to buy a new machine to absorb the load increase.
This is especially true in the mainframe
There wasn’t even any need to make the architecture of the systems (COBOL, etc.) evolve for 30 years
Even outside the mainframe world
The architecture patterns and styles we are using in the operational IS world haven’t really evolve for the last 15 years
Despite new technologies such as Web, Web 2.0, Java, etc. of course
I’m just speaking about architecture and styles
The analytical systems architecture hasn’t evolved in the last 20 years
So everything’s fine ?
No !
As we’ll see, at least two problems have emerged relatively recently
1st concern : the throughput
We are able to store more and more data, no problem
Yet we are less and less able to manipulate this data efficiently
Specifically, fetching all the data onto a single computation machine to process it is becoming more and more difficult
The revolution came from the web giants. They had to find technical answers to business challenges like :
GGL : index the whole web and keep the response time to any query below one second - or how to keep search free for the user ?
LINK : how to understand how millions of users use their website ?
AMZ : how to build a product recommendation engine for millions of customers, over millions of products ?
EBAY : how to search eBay ads, even with misspellings ?
One common challenge : how to handle massive computation needs over massive amounts of data ?
-> New architecture and paradigms are required
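The core of these new paradigms can be sketched with the map/reduce pattern the web giants popularized: split the data into chunks, process each chunk independently (and, in a real cluster, in parallel on the machine holding that chunk), then merge the partial results. Below is a minimal single-process sketch of a word count, the canonical example; the function names are illustrative, not any framework's API.

```python
# Sketch of the map/shuffle/reduce pattern: split the data, process
# chunks independently, merge partial results. Pure-Python stand-in;
# real systems (Hadoop, Spark) run the same idea across many machines,
# moving the computation to where the data lives instead of fetching
# all the data onto one machine.
from collections import Counter
from functools import reduce

def map_chunk(chunk):
    """Map phase: count words within one data chunk."""
    return Counter(chunk.split())

def reduce_counts(acc, partial):
    """Reduce phase: merge one partial count into the accumulator."""
    acc.update(partial)
    return acc

chunks = ["big data big", "data everywhere", "big insights"]
partials = [map_chunk(c) for c in chunks]            # parallelizable
totals = reduce(reduce_counts, partials, Counter())  # merge step
```

The key architectural shift is in the comment: rather than moving terabytes to the computation, the computation (a small function) moves to the data.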
3 ideas …
4 classes of grid architecture
This is an overview of what financial companies are currently investigating regarding big data technologies and private banking use cases
Investment research
Various applications all oriented towards trading, portfolio simulation, market research, or the development / testing of investment strategies and ideas.
Customer knowledge
Covers everything around the customer base and CRM analysis, such as
Know Your Customer – KYC - concerns
Customer profiling
Customer analysis
Customer documents (emails / calls / contracts) analysis
Risk Management
Use cases are mostly oriented towards computing risk metrics and consolidated metrics more efficiently
Quicker / real time
Real-time monitoring
Providing consolidated risk dashboards
Compliance and monitoring
Use cases essentially cover various fraud detection issues and compliance assertions
- Pre / post trade compliance verifications
Communications monitoring
AML
We have seen the playing fields we can play on !!
HERE IS NOW A FOCUS ON 4 SPECIFIC CASES !!!!!!
TODO : the expected gains !!! (typically ROI)
TODO : the architecture slides in the appendix !!!
Why the web giants, and the opening of the solution to the outside world; Google search engine, commodities
Software vendors : somewhat overwhelmed by the situation, reduced to followers
No real-time / intraday values; such computations are simply not possible
Intraday computations are implemented in the operational IS
Some put everything in memory -> huge cluster required / reload-time issue
Data is the new Black Gold
A problem in several businesses today : lack of business insight => difficulty making sound decisions / following the pace of today’s market
Big Data : a tremendous opportunity to drill down and tap into these critical insights
Hence the comparison with crude oil
=> We’ll try to prove this statement in the following presentation
As an introduction -> raising awareness of
The Data problem
The dimensions of data (all the time / everywhere)
The new emerging patterns around Data in Information Systems
1. Big Data as cost killer or enhancer
-> Make things we are already doing either cheaper or better
Various opportunities of this kind can be found simply by asking ourselves
“what compromises have we made at the functional level within the Information System due to limitations of the underlying technology ?”
One example : archiving
Several banks remove the oldest account activity trails or the oldest financial transactions from the operational Information System and store them in archive databases.
This is required to reduce the size of the operational database and keep it efficient.
What if that compromise were no longer necessary, and information from 20 years ago were still available in the live database ?
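The compromise archiving forces can be sketched as routing logic: any query about old activity must consult a separate archive store instead of the live database. Everything below is a hypothetical illustration; the store names, the sample transactions and the 10-year cutoff are invented for the sketch.

```python
# Sketch of the archiving compromise: only the last N years stay in
# the operational DB, older activity must be fetched from an archive.
# Store contents and the 10-year cutoff are illustrative assumptions.
LIVE_YEARS = 10

live_db = {2024: ["tx-live-1"], 2016: ["tx-live-2"]}
archive_db = {2001: ["tx-arch-1"]}

def transactions_for_year(year, current_year=2024):
    """Route the lookup to the live or the archive store."""
    store = live_db if current_year - year < LIVE_YEARS else archive_db
    return store.get(year, [])
```

With a big-data store that keeps the full history queryable, this routing layer (and the archive store itself) disappears: one query path serves 20-year-old data and yesterday's alike.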
2. Big Data as a way to widen the field of possibilities
This time we ask ourselves : “what functional stake / requirement / idea did we give up on because of limitations of the underlying technology ?”
We’ll see some examples soon…