DISCOVER
UNDERSTAND
EVOLVE
Presenting a use case that takes unstructured data through OCR, entity extraction, case management and simple-to-use visualisations.
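The pipeline described above (unstructured data → OCR → entity extraction → case management) can be sketched in a few lines. Everything here is an illustrative toy under stated assumptions: the regexes, the `extract_entities` and `build_case` helpers, and the sample text are all hypothetical, standing in for the OCR engine and trained NER models a real investigation would use.

```python
import re
from collections import defaultdict

def extract_entities(text):
    """Toy entity extraction over OCR'd text: regexes pull card numbers
    and sterling amounts. A production pipeline would use a trained
    NER model rather than hand-written patterns."""
    entities = defaultdict(list)
    entities["card"] = re.findall(r"\b\d{4}(?: \d{4}){3}\b", text)
    entities["amount"] = re.findall(r"£[\d,]+(?:\.\d{2})?", text)
    return dict(entities)

def build_case(doc_id, ocr_text):
    """Attach extracted entities to a simple case-management record,
    keyed by source document, ready for visualisation downstream."""
    return {"doc": doc_id, "entities": extract_entities(ocr_text)}

case = build_case("stmt-001", "Card 4929 1234 5678 9012 charged £2,210,000.00")
print(case["entities"]["amount"][0])  # £2,210,000.00
```

The point of the sketch is the shape, not the patterns: each stage emits a structured record that the next stage can consume, which is what lets case management and visualisation sit on top of raw documents.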
BDW17 London - Steve Bradbury, GRSC - Big Data to the Rescue: A Fraud Case S... (Big Data Week)
In 2003, three criminals were jailed for nine years following the largest card fraud case in Europe, with a publicised loss to card companies of £2.21 million.
Find out how they were caught back then and how Big Data Technologies would have brought them to justice quicker.
Steve Bradbury was the Prime Investigator and Evidence Provider whose work led to the convictions, using data from floppy discs!
The document discusses two investigative journalism case studies where Neo4j was used to analyze large leaked datasets and reveal connections between people, entities, and accounts. In the first case study, Neo4j helped journalists expose corruption related to the Panama Papers leak. In the second case study, Neo4j helped journalists win a Pulitzer Prize for their investigation of the Paradise Papers leak.
Web fraud has become a big issue. Organisations need to step up their efforts quickly to protect their assets. Old-school BI and data warehousing no longer work. These slides contain some thoughts on what could be done.
Digital Transformation and the Journey to a Highly Connected Enterprise (Neo4j)
Jeff Morris, Head of Product Marketing at Neo4j, covers the rise of connections in data and why a forward-thinking enterprise must embrace the connections in its data in order to survive.
The document provides an agenda for a Graph Tour presentation which includes introductions, an overview of graphs and Neo4j, use cases in government and finance, digital transformation and the future. Common themes of connectedness in data are discussed. Neo4j is described as an enterprise-grade native graph platform that enables storing, revealing and querying data relationships.
Learn how to research and utilize Big Data to tell the story of your community and ultimately attract companies, talent, and capital to your front door.
SAP Forum Ankara 2017 - "Verinin Merkezine Seyahat" ("Journey to the Centre of Data") (MDS ap)
The document discusses digital transformation and the journey to data-driven insights. It provides an overview of data types and how data has grown exponentially over time. Both structured and unstructured data are discussed, with examples of semi-structured data like emails and reports. The value of understanding all data sources is emphasized for gaining competitive advantages through analytics. New technologies like complex event processing are enabling lightning-fast action based on diverse data. Finally, the presentation introduces SAP HANA Vora for bridging the divide between enterprise and big data systems to facilitate precision decision making.
The document discusses the challenges of big data and data quality. It defines big data as large volumes of data that are difficult to process using traditional database systems. Big data comes from various sources like social media, machines, and open data. It highlights that poor data quality will undermine the value of big data investments, and that data quality foundations are needed to build successful big data and analytics programs. Effective data integration, profiling, standardization, and governance are critical to addressing the data quality imperative of big data.
Talking about Big Data generates a lot of questions; however, most of the focus is on the technologies and skills required to collect and store this volume of information as opposed to the insight that companies need to derive from it. What factors should organizations consider in order to ensure that they are capitalizing on their investments with these technologies? How do you break through business silos to enable sharing of data to increase organizational value? Leveraging his cross-industry experience at companies like The Walt Disney Company, Travelers Insurance and Demand Media, Brendan Aldrich will discuss the question of “big value” with industry examples and a particular focus on his current work to deploy a “data democracy” within the City Colleges of Chicago.
Session Discovery Topics:
• Big value - keeping an eye on the forest (assumptions, judgment and bias)
• Data democracy - increasing productivity with data transparency and open access
In part 2, EMSI's John Pernsteiner will discuss how you can apply the same principles used successfully in Nevada in your region. In particular, John will focus on the three elements of economic development research—data collection, analysis, and dissemination.
Big data is very large data that is difficult to process using traditional methods. It is characterized by high volume, velocity, and variety. Examples of real-life big data implementations include using social media to understand customer behavior, tracking social media for marketing campaigns, and analyzing medical data to predict readmissions. Challenges include integrating diverse data sources and ensuring ethical access. Common techniques for processing big data are parallel database management systems and MapReduce frameworks like Hadoop.
Big data describes large and complex data sets that require new tools and techniques to analyze. It is generated from many sources like internet usage, social media, sensors, and business transactions. There are three characteristics of big data - volume, velocity, and variety. To analyze big data, open source frameworks like Hadoop use parallel processing across clusters of computers. Analyzing big data can provide competitive advantages to companies and governments by enabling more targeted products and predictive actions.
2017-10-05 Mitigating Cybersecurity and Cyber Fraud Risk in Your Organization (Raffa Learning Community)
An examination of the ever-growing cyber threats behind successful cyber attacks and fraud scams, which cost businesses billions of dollars globally. This session steps through current and emerging cyber attack and cyber fraud scenarios, then discusses how basic but effective security controls can significantly reduce the risks.
Data science is the study of data to extract meaningful insights for business. It is a multidisciplinary approach that combines principles and practices from the fields of mathematics, statistics, artificial intelligence, and computer engineering to analyze large amounts of data. This analysis helps data scientists to ask and answer questions like what happened, why it happened, what will happen, and what can be done with the results.
Why is data science important?
Data science is important because it combines tools, methods, and technology to generate meaning from data. Modern organizations are inundated with data; there is a proliferation of devices that can automatically collect and store information. Online systems and payment portals capture more data in the fields of e-commerce, medicine, finance, and every other aspect of human life. We have text, audio, video, and image data available in vast quantities.
This document discusses data science and provides an overview of the data science ecosystem. It defines data science as an interdisciplinary field that uses scientific methods and processes to extract knowledge and insights from data. It discusses how data science relates to fields like artificial intelligence, machine learning, and big data. The document also provides examples of data science applications in domains like fraud detection, recommendation systems, healthcare, and sports. It outlines the core research issues in data science like data management, modeling, visualization, and ensuring data quality and trustworthiness.
The age of data - Putting responsible data into practice (Phuong Vo An)
The document discusses the exponential growth of data in recent years, with 2.5 quintillion bytes of data created every day and 90% of data created in just the last two years. New digital technologies offer opportunities through data collection and analysis but also pose significant risks around security, privacy, and power imbalances. Responsible data management is important and involves treating people with respect, ensuring data is collected and used ethically and for people's best interests, and having a culture that manages risks around data.
Cyber attacks have been hitting the headlines for years; but in spite of the risks, the reputational damage and the rising cost of fines, there is still an endless stream of businesses being exposed for security failings.
The scale of the problem is vast: Accenture’s 2016 Global Security Report highlighted “an astounding level of breaches”, with the organisations surveyed facing more than 80 targeted attacks every year, of which a third were successful. Much has been made of the evolving threat landscape and the increasing sophistication of attacks. But whilst there is evidence to support the growing complexity of the challenge, all too often the analysis of these high-profile attacks finds that basic, foundational security principles were ignored.
Some commentators argue that the persistence of failings is a direct reflection of organisational priorities, and that while businesses may talk a good game, security is not yet given the attention that it requires at board level. This leaves CISOs and IT leaders fighting a losing battle to secure adequate attention and investment for an area of the business which does not generate revenue.
This conference will look at raising security standards across the business, exploring some of the most persistent problems from IT infrastructure to staff engagement. Amidst a backdrop of perpetual media hysteria, turbulent markets and looming regulatory change, it can prove difficult to establish a coherent picture of the threat, never mind what action to take. The conference will help contextualise the challenging landscape and discuss how to deliver meaningful improvements and end to end organisational resilience.
The document discusses IT security best practices for organizations. It covers assessing security vulnerabilities through vulnerability mapping and penetration testing. Common vulnerabilities discussed include open ports, outdated software and antivirus, and weak authentication processes. The document also covers privacy laws and data breach notification requirements. Maintaining strong security requires treating it as a business decision by understanding risk and prioritizing remediation of the most serious issues.
With the ever-increasing threat of viruses, security breaches, and cyber theft, it is important to understand the basics of network and internet security. In this session, we will discuss the following:
*How to pass the security portion of your audit
*Protecting your hardware
*Security in the cloud
*Privacy Laws
This class is beneficial to IT, Operations, and Administrative professionals.
Smart Data Module 1 - Introduction to Big and Smart Data (caniceconsulting)
This document provides an overview of big and smart data. It defines big data as large volumes of structured, unstructured, and semi-structured data that is difficult to manage and process using traditional databases. It discusses how big data becomes smart data through analysis and insights. Examples of smart data applications are also provided across various industries like retail, healthcare, transportation and more. The document emphasizes that in order to start smart with data, companies need to review their existing data, ask the right questions, and form actionable insights rather than just conclusions.
This document discusses big data and data mining. It defines big data as large volumes of structured and unstructured data that are difficult to process using traditional techniques due to their size. It outlines the 4 Vs of big data: volume, velocity, variety, and veracity. The proposed system would use distributed parallel computing with Hadoop to identify relationships in huge amounts of data from different sources and dimensions. It discusses challenges of big data like data location, volume, privacy, and gaining insights. Solutions involve parallel programming, distributed storage, and access restrictions.
If I want a perfect cyberweapon, I'll target ERP (ERPScan)
ERP Systems are widely used in the Oil and Gas, Manufacturing, Logistics, Financial, Nuclear, Retail, Telecommunication and other industries. All mission-critical data are stored in ERP Systems, so attacks against them may result in espionage, sabotage and fraud.
The presentation gives examples of real and potential attacks and describes important details of ERP Security.
Alexander Polyakov, CTO of ERPScan, presented this talk at RSA Conference Europe 2013.
The Evolution of Data and New Opportunities for Analytics (SAS Canada)
BIG DATA IS EVERYWHERE!
Today we produce around five exabytes every two days … and this is accelerating.
Intelligent devices, what we call the Internet of Things, promise to be the next big explosion.
Explore the evolution of data and new opportunities for analytics.
www.sas.com
BDW17 London - Totte Harinen, Uber - Why Big Data Didn’t End Causal Inference (Big Data Week)
Ten years ago there were rumours of the death of causal inference. Big data was supposed to enable us to rely on purely correlational data to predict and control the world. In this talk, I argue that the rumours were greatly exaggerated. Causal inference is becoming increasingly relevant thanks to improvements in inference methods and, ironically, the availability of data. Far from becoming marginalised, causal inference is today more relevant than it’s ever been.
Similar to BDW17 London - Steve Bradbury - GRSC - Making Sense of the Chaos of Data
BDW17 London - Rita Simoes, Boehringer Ingelheim - Big Data in Pharma: Sittin... (Big Data Week)
As far as data is concerned, pharmaceutical companies have always been clear-sighted and assertive about what insights to get from it, how, and what to do about it. Then the Big Data era came along at its frantic pace, transforming multiple industries all around but, for a number of reasons (privacy and data protection issues chief among them), leaving the Pharma industry behind. How to run the extra mile and keep up with the powerful changes Big Data brings has become a major concern. Strategic opportunities seem to be around the corner. Is the time to bridge the gaps finally here?
BDW17 London - Mick Ridley, Exterion Media & Dale Campbell, TfL - Transformi... (Big Data Week)
Hello London, the ground-breaking media partnership between Transport for London (TfL) and Exterion Media, gives new opportunities for brands to talk to the London audience in innovative ways and generates vital revenue for London’s transport network.
TfL and Exterion have been working together in the Hello London partnership for a year. Part of the collaboration was around the utilisation of data collected by TfL to better inform advertising investment decisions.
This has led to ground-breaking work in the Out-of-Home advertising sector and the first example of this is Taps Segmentation. Developed by the TfL Data Science team, it allows Exterion to understand demographic patterns at stations based on aggregated contactless and Oyster card usage. This de-personalised data can be analysed for different times of the day and is a game changer – allowing Exterion to rethink how both their classic and digital inventory can be packaged and tailored specifically for clients.
The presentation will cover how TfL and Exterion have collaborated, the approach used by TfL and how Exterion are using it to generate revenue which is reinvested in the transport network.
BDW17 London - Abed Ajraou - First Utility - Putting Data Science in your Bus...Big Data Week
Data Science is now well established in our businesses, and everyone considers data as a key asset and critical for our competitiveness.
However, Data Science is not easy to manage: projects very often fail, and the investment made is not seen as profitable.
The aim of this talk is to share the knowledge in different areas:
* avoid classical mistakes in Data Science
* use the right Big Data technology
* apply the right methodology
* make the Data Science team more efficient
BDW17 London - Andy Boura - Thomson Reuters - Does Big Data Have to Mean Big ...Big Data Week
The document discusses some of the risks associated with big data, including the risk of data breaches getting more costly as data volumes and repositories increase. It notes that smaller breaches involving 10,000 to 100,000 records on average cost hundreds per record, while mega-breaches of millions of records can cost billions and be in the range of pounds per record. The main sources of risk are identified as user error, system glitches, and attacks, with malicious attacks being the costliest. It provides some recommendations around applying security controls like access management and automation while also considering dependencies and maintaining good data hygiene.
BDW17 London - Tom Woolrich, Financial Times - What Does Big Data Mean for th...Big Data Week
Content:
1. A brief history of the FT
2. What does Big Data mean to the FT?
3. The benefits of Big Data & how we use it
4. How we do it
5. What’s next for us?
BDW17 London - Andrew Fryer, Microsoft - Everybody Needs a Bit of Science in ...Big Data Week
Science is a way of thinking more than a body of knowledge. It involves asking why, how, and what questions. Artificial intelligence has advanced due to cloud computing, big data, and open source approaches which have enabled data-driven decision making and rapid learning from experiences. There are still issues around creativity, ethics, and replacing human experience with technologies.
BDW16 London - Alex Bordei, Bigstep - Building Data Labs in the CloudBig Data Week
Building Data Labs in the Cloud summarizes how to build data labs in the cloud by connecting on-premise services through VPN or targeted firewalls, integrating identity services between on-premise and cloud realms, enabling single sign-on with two-factor authentication, using encryption with cloud or on-premise HSMs, leveraging Spark for data science, SQL, ETL, machine learning and graph processing, adopting a multi-context architecture for maintenance and efficiency, and ensuring real-time systems provide performance, stability, serviceability and fault tolerance.
BDW16 London - William Vambenepe, Google - 3rd Generation Data PlatformBig Data Week
1. The document discusses Google Cloud's 3rd generation data platform and services for managing large-scale data and analytics workloads. It focuses on managed services that allow users to focus on insights rather than infrastructure maintenance.
2. The platform includes services for data ingestion, processing, storage and analytics including Cloud Pub/Sub, Dataflow, BigQuery, Dataproc, Bigtable and Cloud Storage. It aims to provide a serverless platform with auto-optimized usage and pay per use pricing model.
3. Over 15 years Google has developed technologies for tackling big data problems including papers, open source projects and cloud products. Core components of their data platform are discussed including the Beam programming model and Dataflow for unified
BDW16 London - Scott Krueger, skyscanner - Does More Data Mean Better Decisio...Big Data Week
We have seen vast improvements to data collection, storage, processing and transport in recent years. An increasing number of networked devices are emitting data and all of us are preparing to handle this wave of valuable data.
Have we, as data professionals, been too focused on the technical challenges and analytical results?
What about the data quality? Are we confident about it? How can we be sure we are making good decisions?
We need to revisit methods of assessing data quality on our modernized data platforms. The quality of our decision making depends on it.
BDW16 London - Nondas Sourlas, Bupa - Big Data in HealthcareBig Data Week
The document discusses Bupa's use of analytics in healthcare, including risk modelling and care management, and referral management. For risk modelling and care management, Bupa uses predictive modelling to identify high-risk patients for targeted outreach programs, which have led to reductions in outpatient visits, tests, and surgical procedures, saving 9-10% in care costs. For referral management, Bupa profiles over 18,000 consultants based on claims data to guide over 700,000 pre-authorizations, achieving estimated healthcare savings of 9-11% of guided spend.
BDW16 London - John Callan, Boxever - Data and Analytics - The Fuel Your Bran...Big Data Week
Unsuccessful marketing campaigns are leaving customers disgruntled, making them 40% less likely to return. Companies are casting aside useful data that can provide further insights into better products/better connections with customers. John Callan, VP of Marketing at Boxever will discuss how AI can change how businesses predict trends, reduce risks, and improve efficiency.
Audience will:
Gain expert-level understanding of data and machine learning that’s used in today’s market
Identify successful ways companies use machine learning to target customers with personalized content
Learn from major airlines' use cases to skillfully target customers and show them exactly what they want to see.
BDW16 London - John Belchamber, Telefonica - New Data, New Strategies, New Op...Big Data Week
Through the experiences of supporting a Multi-Country roll out using data to drive more effective Network capability, we will explain how we have:
Created new internal capability to support local countries, developed skill sets in each country, and provided technical infrastructure, algorithms and visualisations to drive the data culture and big data strategies across Telefonica business units.
Through this framework, we will explain how to blend technical and business needs to maximise the benefits and drive better business performance.
BDW16 London - Deenar Toraskar, Think Reactive - Fast Data Key to Efficient C...Big Data Week
The Basel Committee on Banking Supervision (BCBS) and local regulators have been focussed on making banks safer and more resilient. A whole raft of new capital charges and constraints on liquidity and leverage has been introduced: Basel II.5, Basel III, Dodd-Frank, FRTB (“Basel IV”), etc. These have significantly increased the risk data management capabilities banks must have, capabilities that only big data tools can provide.
This talk will cover the challenges of building a position-aware risk management platform that properly aggregates all intra-day trading activity and monitors exposures and risk. The fast data stack can help banks create such a platform and provide a robust foundation to achieve compliance and, ultimately, a significant competitive edge by making efficient use of capital.
BDW16 London - Jonny Voon, Innovate UK - Smart Cities and the Buzz Word BingoBig Data Week
With the United Nations predicting that 66 percent of the world's population, including an extra 2.5 billion people, will be living in urban areas, our cities are getting extra attention. If we want to avoid the dystopian megacities of the future, then we must begin the technology transformation in our cities now.
BDW16 London - Josh Partridge, Shazam - How Labels, Radio Stations and Brand...Big Data Week
“At Shazam, we think data can be beautiful and stunningly inspiring. The pictures we paint with our data tell stories about changing culture, tastes, and shared discoveries. A truly great new song can sweep across the globe in a wave of Shazams that transcends politics, language, or religion”, Greg Glanday, Chief Revenue Officer at Shazam.
This presentation will offer the audience a few examples of how they can use the data from Shazam to get fantastic insight into consumers' preferences, and how to take that insight and apply it to a brand.
Giving 3 or 4 great examples of what we do at Shazam, anyone in the audience can understand what this data means, really see this data and then be able to leverage it to make smart marketing decisions.
Main takeaway: a clear understanding of what Shazam data is and how brands can use it.
BDW16 London - Wael Elrifai, Pentaho - Big Data-Driven InnovationBig Data Week
This presentation will explore data gathering techniques, tools, and analysis processes in the business innovation process. By way of example, the presentation will outline the stages of planning, designing, and delivering behind one of today’s most popular business innovation use cases for IoT – a predictive maintenance system. It will also reveal the different areas in which businesses gain value (and cost savings) by automating the processes of data engineering and data discovery.
BDW16 London - Vojta Rocek, Trologic - Challenging Big DataBig Data Week
Many big data projects ultimately make life harder, not easier, for the end user. Businesses everywhere are swamped with data, decisions are slower and still based on gut feeling, and moreover, 95% of company data is noise.
If you are going through the same pains and wish for a world where your BI can answer what the top opportunities are right now, which market segment has the greatest potential, and what is being done about the top five disasters, you should come and listen to this talk.
Vojta will briefly demo a solution that brings all these benefits. He will show you how you can cut costs in hospitals, improve profitability in banking and insurance and drive revenue in retail and FMCG.
Digital Banking in the Cloud: How Citizens Bank Unlocked Their MainframePrecisely
Inconsistent user experience and siloed data, high costs, and changing customer expectations – Citizens Bank was experiencing these challenges while it was attempting to deliver a superior digital banking experience for its clients. Its core banking applications run on the mainframe and Citizens was using legacy utilities to get the critical mainframe data to feed customer-facing channels, like call centers, web, and mobile. Ultimately, this led to higher operating costs (MIPS), delayed response times, and longer time to market.
Ever-changing customer expectations demand more modern digital experiences, and the bank needed to find a solution that could provide real-time data to its customer channels with low latency and operating costs. Join this session to learn how Citizens is leveraging Precisely to replicate mainframe data to its customer channels and deliver on their “modern digital bank” experiences.
Conversational agents, or chatbots, are increasingly used to access all sorts of services using natural language. While open-domain chatbots - like ChatGPT - can converse on any topic, task-oriented chatbots - the focus of this paper - are designed for specific tasks, like booking a flight, obtaining customer support, or setting an appointment. Like any other software, task-oriented chatbots need to be properly tested, usually by defining and executing test scenarios (i.e., sequences of user-chatbot interactions). However, there is currently a lack of methods to quantify the completeness and strength of such test scenarios, which can lead to low-quality tests, and hence to buggy chatbots.
To fill this gap, we propose adapting mutation testing (MuT) for task-oriented chatbots. To this end, we introduce a set of mutation operators that emulate faults in chatbot designs, an architecture that enables MuT on chatbots built using heterogeneous technologies, and a practical realisation as an Eclipse plugin. Moreover, we evaluate the applicability, effectiveness and efficiency of our approach on open-source chatbots, with promising results.
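The operator-based approach described above can be illustrated with a toy mutation operator (a hypothetical sketch, not one of the paper's actual operators or its Eclipse plugin): deleting one training phrase from an intent definition produces a mutant that a strong set of test scenarios should be able to kill.

```python
# Toy mutation operator for a task-oriented chatbot design (illustrative
# only; the intent structure here is an assumption, not the paper's format).
def delete_phrase_mutants(intent):
    """intent: {"name": str, "phrases": [str]} -> one mutant per deleted phrase."""
    mutants = []
    for i in range(len(intent["phrases"])):
        mutants.append({
            "name": intent["name"],
            "phrases": intent["phrases"][:i] + intent["phrases"][i + 1:],
        })
    return mutants

def mutation_score(mutants, killed):
    """Fraction of mutants detected (killed) by the test scenarios."""
    return len(killed) / len(mutants) if mutants else 1.0
```

A test suite that fails on a mutant "kills" it; the mutation score then quantifies how strong the scenarios are, which is exactly the gap the paper addresses.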
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Your One-Stop Shop for Python Success: Top 10 US Python Development Providersakankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
"Frontline Battles with DDoS: Best practices and Lessons Learned", Igor IvaniukFwdays
In this talk we will discuss DDoS protection tools and best practices, network architectures, and what AWS has to offer. We will also look into one of the largest DDoS attacks on Ukrainian infrastructure, which happened in February 2022. We'll see what techniques helped keep the web resources available for Ukrainians, and how AWS improved DDoS protection for all customers based on the Ukraine experience.
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of the presentation accompanying the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on communications and signalling systems on railways, held at the Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). It was attended by around 500 participants and 200 online followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
AppSec PNW: Android and iOS Application Security with MobSFAjin Abraham
Mobile Security Framework - MobSF is a free and open source automated mobile application security testing environment designed to help security engineers, researchers, developers, and penetration testers to identify security vulnerabilities, malicious behaviours and privacy concerns in mobile applications using static and dynamic analysis. It supports all the popular mobile application binaries and source code formats built for Android and iOS devices. In addition to automated security assessment, it also offers an interactive testing environment to build and execute scenario based test/fuzz cases against the application.
This talk covers:
Using MobSF for static analysis of mobile applications.
Interactive dynamic security assessment of Android and iOS applications.
Solving Mobile app CTF challenges.
Reverse engineering and runtime analysis of Mobile malware.
How to shift left and integrate MobSF/mobsfscan SAST and DAST in your build pipeline.
Monitoring and Managing Anomaly Detection on OpenShift.pdfTosin Akinosho
Monitoring and Managing Anomaly Detection on OpenShift
Overview
Dive into the world of anomaly detection on edge devices with our comprehensive hands-on tutorial. This SlideShare presentation will guide you through the entire process, from data collection and model training to edge deployment and real-time monitoring. Perfect for those looking to implement robust anomaly detection systems on resource-constrained IoT/edge devices.
Key Topics Covered
1. Introduction to Anomaly Detection
- Understand the fundamentals of anomaly detection and its importance in identifying unusual behavior or failures in systems.
2. Understanding Edge (IoT)
- Learn about edge computing and IoT, and how they enable real-time data processing and decision-making at the source.
3. What is ArgoCD?
- Discover ArgoCD, a declarative, GitOps continuous delivery tool for Kubernetes, and its role in deploying applications on edge devices.
4. Deployment Using ArgoCD for Edge Devices
- Step-by-step guide on deploying anomaly detection models on edge devices using ArgoCD.
5. Introduction to Apache Kafka and S3
- Explore Apache Kafka for real-time data streaming and Amazon S3 for scalable storage solutions.
6. Viewing Kafka Messages in the Data Lake
- Learn how to view and analyze Kafka messages stored in a data lake for better insights.
7. What is Prometheus?
- Get to know Prometheus, an open-source monitoring and alerting toolkit, and its application in monitoring edge devices.
8. Monitoring Application Metrics with Prometheus
- Detailed instructions on setting up Prometheus to monitor the performance and health of your anomaly detection system.
9. What is Camel K?
- Introduction to Camel K, a lightweight integration framework built on Apache Camel, designed for Kubernetes.
10. Configuring Camel K Integrations for Data Pipelines
- Learn how to configure Camel K for seamless data pipeline integrations in your anomaly detection workflow.
11. What is a Jupyter Notebook?
- Overview of Jupyter Notebooks, an open-source web application for creating and sharing documents with live code, equations, visualizations, and narrative text.
12. Jupyter Notebooks with Code Examples
- Hands-on examples and code snippets in Jupyter Notebooks to help you implement and test anomaly detection models.
Northern Engraving | Nameplate Manufacturing Process - 2024Northern Engraving
Manufacturing custom quality metal nameplates and badges involves several standard operations. Processes include sheet prep, lithography, screening, coating, punch press and inspection. All decoration is completed in the flat sheet with adhesive and tooling operations following. The possibilities for creating unique durable nameplates are endless. How will you create your brand identity? We can help!
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
Introduction of Cybersecurity with OSS at Code Europe 2024Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
3. Steve Bradbury – Head of Fraud and Data Division
• Over 25 Years in Data
• Fraud/ Risk experience since 1992
• Fraud experience gained at American Express (12 years), HSBC (5 years), Thomas Cook (5 years), Consultancy (5 years)
• Roles – Head of Fraud, Chief Data Scientist, Head of Data and MIS EMEA, Senior Technical Manager, Fraud Investigator
• SME ACROSS INVESTIGATION/ INSIGHT ENGAGEMENTS SPECIALISING IN
FRAUD, DATA KNOWLEDGE, INSIGHT AND ANALYSIS, LEADERSHIP, CLIENT
FACING DEMONSTRATIONS/ ENGAGEMENT, ASSET TRACING, CORRUPTION,
LAW ENFORCEMENT, DATA ENRICHMENT, TECHNOLOGY.
• DEVELOPED SOCIAL MEDIA INSIGHT AND REPORTING ACROSS CRIMINAL
AND TERRORISM NETWORKS USING CUTTING EDGE TECHNOLOGIES.
• RESEARCHED, PROPOSED AND DELIVERED INNOVATIVE TECHNICAL
SOLUTIONS.
• MULTIPLE GEOGRAPHICAL AND INDUSTRY EXPERIENCED – CARD, BANKING,
RETAIL, ONLINE GAMING, LAW ENFORCEMENT, GOVERNMENT,
INSURANCE, TELECOMS, DARK WEB, NATIONAL TRADING STANDARDS
SCAM CHAMPION, 419’S
• SUCCESSFUL INVESTIGATION OF LARGEST CARD FRAUD CASE IN
EUROPE (SEE MY SESSION AT 1500 IN THIS ROOM)
• DESIGN AND DELIVERY OF GLOBAL AML/ RISK REPORTING
• DESIGN AND ROLL OUT OF EUROPEAN FRAUD DECS
• REDESIGNED ALL FRAUD REPORTING FROM ANTIQUATED
MAINFRAME TO SQL
• DELIVER EXTENSIVE BI PROJECT TO AMEX JV (HARDWARE,
SOFTWARE, PEOPLE, REPORTING, DATA FEEDS)
• CREATED FIRST OF ITS KIND CUSTOMER CRISIS RECOVERY
PROGRAM
5. Discover
• OBVIOUSLY THE FIRST PART OF ANY BI ANALYTICS PROJECT IS UNDERSTANDING WHAT THE BUSINESS
OBJECTIVE IS AND WHAT SUCCESS LOOKS LIKE.
• SECOND TO THIS IS THE DISCOVERY PHASE –
• WHAT DATA DO YOU HAVE?
• WHAT TECHNOLOGY WILL YOU USE?
• WHAT DO YOU NEED TO KNOW ABOUT THE DATA?
CASE STUDY
ONE OF THE LARGEST EUROPEAN TRAVEL AGENCIES WENT INTO ADMINISTRATION.
• THE DATA –
• 26 COMPLETE PC DUMPS – 3.2TB OF DATA
• 14 FILING CABINETS
6. Discover
FIRSTLY WE NEED TO SEE WHAT THE 3.2TB OF DATA CONTAINS
THE RESULT -
1.7TB OF PERSONAL DATA – MUSIC, PHOTOS, ETC.
SEVERAL THOUSAND PROTECTED FILES
THIS LEAVES US 1.5TB OF DATA AND 14 FILING CABINETS OF
HARD COPY DATA
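The triage step above (separating personal media from potentially evidential files) can be sketched as a simple extension-based sweep. This is a minimal illustration, not the tooling used in the actual case, and the extension list is an assumption.

```python
import os
from collections import defaultdict

# Extensions treated as personal media for this sketch (an assumption;
# a real triage list would be agreed with the investigation team).
PERSONAL_EXTS = {".mp3", ".jpg", ".jpeg", ".png", ".mov", ".avi"}

def triage(root):
    """Walk a mounted disk image and total bytes per relevance bucket."""
    totals = defaultdict(int)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            bucket = "personal" if ext in PERSONAL_EXTS else "review"
            try:
                totals[bucket] += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                continue  # unreadable file: skip here, log in a real pipeline
    return dict(totals)
```

Running this across the 26 PC images would give the personal-versus-review split before any manual work begins.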
7. Discover
CRACK IT AND OCR
ONCE WE CRACKED THE POTENTIALLY USEFUL FILES, WE USED
OPTICAL CHARACTER RECOGNITION TOOLS TO IMPORT ALL THE
DATA INTO A SINGLE PLATFORM.
WORKING CLOSELY WITH THE ADMINISTRATORS WE GAINED
KNOWLEDGE OF THE BUSINESS, THE KEY PLAYERS, WHAT CAUSED
THE FINANCIAL ISSUE, AND REVISITED THE BUSINESS GOALS.
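A minimal sketch of that OCR ingestion step, assuming the open-source Tesseract engine via the pytesseract wrapper (the talk does not name the actual tools used):

```python
import re

def normalise(text):
    """Collapse OCR whitespace noise into single spaces, one record per page."""
    return re.sub(r"\s+", " ", text).strip()

def ocr_page(image_path):
    """OCR one scanned page into a clean text record.

    Assumes Pillow, pytesseract and the Tesseract engine are installed;
    these are stand-ins for whatever OCR tooling the team actually used.
    """
    from PIL import Image
    import pytesseract
    return normalise(pytesseract.image_to_string(Image.open(image_path)))
```

Each normalised page can then be loaded into the single search platform alongside the born-digital files.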
8. Understand
SEARCH THE DATA
USING KEY WORDS AND PHRASES WE UNDERTOOK ENTITY
EXTRACTION TO AID THE INVESTIGATION.
ONCE INGESTED, SEARCHING ACROSS THE DATA AIDS
INVESTIGATION AND ANALYSIS.
ONCE LEADS ARE FOUND, CASE FILE CREATION STORES THEM
AND DRIVES FURTHER INVESTIGATIONS.
IN THIS INSTANCE SOME 40 CASE FILES WERE CREATED.
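The keyword-driven search that seeded those case files can be sketched as follows. The keyword list is purely illustrative; a real investigation would use terms agreed with the administrators and, ideally, a proper entity-extraction pipeline.

```python
import re
from collections import defaultdict

# Illustrative keyword list (an assumption, not the actual search terms).
KEYWORDS = ["transfer", "offshore", "director"]

def find_leads(documents, keywords=KEYWORDS):
    """Map each keyword to the ids of documents that mention it.

    documents: {doc_id: text}. Whole-word, case-insensitive matching.
    """
    leads = defaultdict(list)
    for doc_id, text in documents.items():
        lowered = text.lower()
        for kw in keywords:
            if re.search(r"\b" + re.escape(kw) + r"\b", lowered):
                leads[kw].append(doc_id)
    return dict(leads)
```

Each non-empty bucket is a candidate lead; promoting a bucket plus its documents into a case file gives the structure described above.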
9. Evolve
VISUALISE THE DATA
USING CASE FILES AND VISUALISATION SEARCHES
ALLOWS INVESTIGATORS TO CLEARLY SEE DATA IN A
NON-TECHNICAL ENVIRONMENT.
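Behind such a visualisation sits simple link data: entities as nodes, shared case files as edges. A hypothetical sketch of building that edge list (a graph front-end or graph database, not shown here, would then render it for a non-technical audience):

```python
from collections import defaultdict

def build_links(case_files):
    """Turn case files into weighted co-occurrence edges.

    case_files: {case_id: [entity, ...]}. Two entities appearing in the
    same case file get an undirected edge; the weight counts shared files.
    """
    edges = defaultdict(int)
    for entities in case_files.values():
        uniq = sorted(set(entities))
        for i, a in enumerate(uniq):
            for b in uniq[i + 1:]:
                edges[(a, b)] += 1
    return dict(edges)
```

Heavily weighted edges surface the relationships an investigator should examine first.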
10. Evolve
THE RESULT
THROUGH THIS INVESTIGATION INTO THIS LARGELY FAMILY-RUN TRAVEL AGENCY, MILLIONS OF EUROS OF ASSETS WERE UNCOVERED.
THE TRAVEL AGENCY WAS REMOVED FROM ADMINISTRATION AND ALL DEBTS WERE SETTLED.
PRESENTING THE DATA IN SIMPLE-TO-UNDERSTAND VISUALISATIONS THAT EMPOWER A NON-TECHNICAL AUDIENCE
WITH KEY BI SAVED THIS COMPANY AND PREVENTED MULTIPLE CLIENTS AND COMPANIES FROM LOSING MILLIONS OF EUROS.
DATED TECHNOLOGY WAS REPLACED AND STRICT CONTROLS AND BUSINESS RULES WERE APPLIED.
THE COMPANY IS STILL SUCCESSFULLY TRADING TODAY.
11. The Dark Web – It's not that scary
– THE DEEP WEB CONTAINS 7500 TERABYTES OF INFORMATION. THE SURFACE WEB, IN
COMPARISON, CONTAINS 19 TERABYTES OF CONTENT.
– MORE THAN 200,000 DEEP WEB SITES CURRENTLY EXIST.
– TOGETHER, THE 60 LARGEST DEEP WEB SITES CONTAIN AROUND 750 TERABYTES OF DATA,
SURPASSING THE SIZE OF THE ENTIRE SURFACE WEB 40 TIMES.
– THE TOTAL QUALITY CONTENT OF THE DEEP WEB IS 1,000 TO 2,000 TIMES GREATER THAN
THAT OF THE SURFACE WEB.
– 550 BILLION INDIVIDUAL DOCUMENTS CAN BE FOUND ON THE DEEP WEB COMPARED TO
THE SURFACE WEB’S 1 BILLION INDIVIDUAL DOCUMENTS.
– 95% OF THE DEEP WEB IS PUBLICLY ACCESSIBLE, MEANING NO FEES OR SUBSCRIPTIONS.
Source - https://hewilson.wordpress.com/what-is-the-deep-web/statistics/
12. UK Social Data Stats
39+ MILLION USERS
20+ MILLION USERS
21+ MILLION USERS
14+ MILLION USERS
65+ MILLION PEOPLE – 60+ MILLION INTERNET USERS
13. Evolve
THE FUTURE
• DATA ENRICHMENT IS A KEY PART OF ANY DATA
INVESTIGATION.
• SOCIAL DATA IS GROWING EXPONENTIALLY YEAR ON
YEAR.
• IT IS ESTIMATED ONLY 5% OF SCAM VICTIMS TELL THEIR
BANK WHILE 75% TALK ABOUT IT ON SOCIAL MEDIA.
• SOCIAL MEDIA IS KEY TO BUILDING KYC.
• OPEN SOURCE SUCH AS COMPANIES HOUSE IS ALL
INGESTIBLE TO ENRICH YOUR DATASETS.
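As an example of that open-data enrichment, the public Companies House API exposes a company search, authenticated with HTTP Basic auth where the API key is the username and the password is empty. This sketch only builds the authenticated request; the key is a placeholder and nothing is sent.

```python
import base64
import urllib.parse
import urllib.request

BASE = "https://api.company-information.service.gov.uk"

def company_search_request(query, api_key):
    """Build (but do not send) an authenticated company-search request."""
    url = BASE + "/search/companies?q=" + urllib.parse.quote(query)
    # Companies House Basic auth: key as username, empty password.
    token = base64.b64encode((api_key + ":").encode()).decode()
    return urllib.request.Request(url, headers={"Authorization": "Basic " + token})
```

Passing the request to `urllib.request.urlopen` would return JSON company records ready to join against the entities extracted from the case files.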
IF YOU DON’T LOOK AFTER YOUR CUSTOMERS
SOMEONE ELSE WILL.
14. Come and Say Hello
PLEASE COME AND VISIT US AT OUR BOOTH
TALK TO US ABOUT -
- THE GRSC DATA PLATFORM
- FRAUD
- ANALYTICS
- AML/ KYC
- SOCIAL MEDIA ANALYTICS
- THE DARK WEB
- COMPLIANCE
JOIN ME AT 1500 IN THIS ROOM FOR A 30-MINUTE TAKEAWAY SESSION WITH SOME COOL FRAUD STUFF THROWN IN