The role of digital technology is rapidly shifting, from being a driver of marginal efficiency to an enabler of fundamental innovation and disruption, according to a white paper on digital enterprises by the World Economic Forum. The digital economy has changed the world of business, levelling the playing field for new entrants to compete head-on with larger, traditional enterprises.
To be competitive in today’s digital economy, organizations need to take steps to become digitally mature. This can be done through both internal and external digital innovations and transformations, including:
- Transforming existing legacy systems via an integration layer
- Building a macro- or micro-services layer coupled with leaner DevOps for faster time-to-market
- Enabling API-driven, stakeholder-inclusive businesses
- Identifying new business insights via analytics
As a result of the changing landscape in many industries, fraud is a growing problem – accelerating in both the number of incidents and in its complexity. In Canada, we estimate that the total impact of fraud is close to $2 billion – stemming from both losses and the cost of operations. These challenges opened the door for Symcor to offer industry-leading digital and data services to detect and prevent current and emerging fraud. Hortonworks worked with us to transform our service offerings. One of the most significant features we developed in delivering these services was the data governance and security needed to protect our clients’ data. To ensure we were doing the right things, Symcor’s Privacy and Data Governance team designed a comprehensive data governance policy to empower the ethical use of data for fraud detection and prevention, and the Hortonworks platform allowed us to integrate that policy directly within our technology solutions.
Thanks to Hortonworks’ assistance, we were able to deliver the objectives listed above. This solution has helped Symcor evolve from a leading business processing provider in Canada into an industry-leading digital and data services provider. CHRIS WOJDAK, Sr. Program/Managing Architect Leader, Symcor Inc., and MIKE MACDONALD
Future-Proof Your Streaming Analytics Architecture – StreamAnalytix Webinar (Impetus Technologies)
View the webcast on http://bit.ly/1HFD8YR
Speakers from Forrester and Impetus discuss the options and an optimal architecture for incorporating real-time insights into your applications while keeping the door open to future innovation.
Big Data as a Service: A Neo-Metropolis Model Approach for Innovation (SoftServe)
Presented at The Hawaii International Conference on System Sciences by Hong-Mei Chen and Rick Kazman (University of Hawaii), Serge Haziyev and Valentyn Kropov (SoftServe), Dmitri Chtchoutov.
The Big Picture: Real-time Data is Defining Intelligent Offers (Cloudera, Inc.)
New research shows that 57% of the buying cycle is completed before a prospect even speaks to a company. Marketers already know this: ninety-six percent (96%) of organizations believe that email personalization can improve email marketing performance. But where do we get this increasingly personal direction? The answer is likely in your customer data. To understand your customers’ needs in the moment they feel the need to act, you need a platform that can leverage real-time data. Apache Kudu is a Cloudera component that makes dealing with quickly changing data fast and easy. Companies are leveraging next-generation data stores like Kudu to build data applications that deliver smart promotions, real-time offers, and personalized marketing. Join us as we discuss modern approaches to real-time application development and highlight key use cases powered by Cloudera’s operational database.
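The property that makes Kudu a fit for this kind of workload is upsert semantics: the latest state of each fast-changing row stays immediately queryable. A minimal plain-Python sketch of that access pattern (illustrative only – this is not the Kudu API, and all names are invented):

```python
# Illustrative upsert store: writes insert-or-update a keyed row, and reads
# always see the latest state - the access pattern Kudu provides for
# quickly changing data such as live customer context.

class UpsertStore:
    """Keyed store where each write merges into the existing row."""

    def __init__(self):
        self._rows = {}

    def upsert(self, key, **columns):
        # Merge new column values into the row, creating it if absent.
        row = self._rows.setdefault(key, {})
        row.update(columns)

    def get(self, key):
        return self._rows.get(key)


store = UpsertStore()
store.upsert("cust-42", last_page="/pricing", visits=1)
# A later event updates part of the row and adds a new column.
store.upsert("cust-42", visits=2, last_offer="spring-promo")
print(store.get("cust-42"))
# {'last_page': '/pricing', 'visits': 2, 'last_offer': 'spring-promo'}
```

In Kudu itself this pattern is expressed as `UPSERT` statements against a keyed table; the point here is only that the latest row state is what a real-time offer engine reads.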
Build it…will they come by Shawn Trainer (Data Con LA)
Abstract: The truth about enabling self-service (and why you need it). Data is growing astronomically, both historically and in real time. So is the need for exploration and discovery. One size doesn’t fit all. We’ll cover how to efficiently deliver information on demand and promote self-service adoption with the right data platform.
AI Data Acquisition and Governance: Considerations for Success (Databricks)
Data pipelines, governance, and a plan for growing and regularly updating models need to be part of the AI strategy from the outset.
This session will cover:
Defining AI governance: What this means and how definitions of subjects like ethics and effectiveness can differ between organizations.
Data governance: Companies must rely on an AI governance program to ensure only high-quality, unbiased and consistent data are used in training.
AI is a growing necessity for enterprises and businesses; it provides an avenue for scaling quickly and efficiently.
Best practices and implementation: how to implement AI that meets the organization’s defined governance requirements.
Planning the data pipeline and growing/updating the models: AI is not static in the real world; models must be frequently updated to maintain relevance and accuracy.
3 key takeaways or attendee benefits of the session:
Understand how to assess your organization’s need for AI; how to identify the opportune areas for transforming processes, interactions, scaling, cost.
How to start the implementation process. Defining data and AI governance and how to build the training data pipeline within that framework.
Best practices for maintaining AI; how to use data to evaluate models and continuously iterate on them to reflect the real world.
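The data governance point above – only high-quality, consistent data entering training – can be made concrete as a quality gate that filters records before they reach the training set. This is an illustrative sketch only; the field names and checks are assumptions, not any product’s API:

```python
# Sketch of a governance-style quality gate in front of a training set.
# Records missing required fields or containing inconsistent values are
# rejected before they can bias model training.

REQUIRED_FIELDS = {"customer_id", "amount", "label"}

def passes_quality_gate(record):
    """Reject records with missing fields or out-of-range values."""
    if not REQUIRED_FIELDS.issubset(record):
        return False
    if record["amount"] < 0:           # inconsistent: negative amount
        return False
    if record["label"] not in (0, 1):  # labels must be binary for this task
        return False
    return True

raw = [
    {"customer_id": "a1", "amount": 120.0, "label": 1},
    {"customer_id": "a2", "amount": -5.0, "label": 0},  # fails: negative amount
    {"customer_id": "a3", "amount": 40.0},              # fails: missing label
]
training_set = [r for r in raw if passes_quality_gate(r)]
print(len(training_set))  # 1
```

A real governance program would also log rejections and track data lineage, but the gate itself is the enforcement point the session describes.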
Fighting financial fraud at Danske Bank with artificial intelligence (Ron Bodkin)
Danske Bank, the leader in mobile payments in Denmark, is innovating with AI. Danske Bank’s existing fraud detection engine is being enhanced with deep learning algorithms that can analyze potentially tens of thousands of latent features. Danske Bank’s current system is largely based on handcrafted rules created by the business, based on intuition and some light analysis. The system is effective at blocking fraud, but it has a high rate of false positives, which is expensive and inconvenient, and it has proved impractical to update and maintain as fraudsters evolve their capabilities. Moreover, the bank understands that fraud is getting worse in the near- and long-term future due to the increased digitization of banking and the prevalence of mobile banking applications and recognizes the need to use cutting-edge techniques to engage fraudsters not where they are today but where they will be tomorrow.
Application fraud is an important emerging trend, in which machines fill in transaction forms. There is evidence that criminals are employing sophisticated machine-learning techniques to attack, so it’s critical to use sophisticated machine learning to catch fraud in banking and mobile payment transactions.
Ron Bodkin and Nadeem Gulzar explore how Danske Bank uses deep learning for better fraud detection. Danske Bank’s multistep program first productionizes “classic” machine learning techniques (boosted decision trees) while in parallel developing deep learning models with TensorFlow as a “challenger” to test. The system was first tested in shadow production and then in full production in a champion-challenger setup against live transactions. Ron and Nadeem explain how the bank is integrating the models with the efforts already running, giving the bank and its investigation team the ability to adapt to new patterns faster than before and taking on complex highly varying functions not present in the training examples.
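The champion-challenger setup described above can be sketched in a few lines: the champion alone makes the blocking decision, while the challenger scores the same live transactions in shadow mode and its output is only logged for later comparison. Both models below are invented stand-in functions, not Danske Bank’s actual rules or networks:

```python
# Minimal champion-challenger sketch: only the champion's score drives the
# block/allow decision; the challenger runs on the same transactions but
# its scores are logged for offline comparison (shadow production).

def champion_score(txn):
    # Stand-in for the rules/boosted-tree champion.
    return 0.9 if txn["amount"] > 10_000 else 0.1

def challenger_score(txn):
    # Stand-in for the deep learning challenger.
    return 0.8 if txn["amount"] > 5_000 or txn["new_device"] else 0.05

shadow_log = []

def handle_transaction(txn, threshold=0.5):
    blocked = champion_score(txn) >= threshold          # champion decides
    shadow_log.append((txn["id"], challenger_score(txn)))  # challenger logged only
    return "BLOCK" if blocked else "ALLOW"

txns = [
    {"id": 1, "amount": 12_000, "new_device": False},
    {"id": 2, "amount": 6_000, "new_device": True},
]
decisions = [handle_transaction(t) for t in txns]
print(decisions)        # ['BLOCK', 'ALLOW']
print(len(shadow_log))  # 2
```

Promoting the challenger then amounts to swapping which scoring function feeds the decision, once the shadow log shows it beats the champion on false positives.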
What are actionable insights? (Introduction to Operational Analytics Software) – Newton Day Uploads
What are Actionable Insights? In this presentation I outline what Actionable Insights are and the Operational Analytics Software that can produce them. Because Business Intelligence and the Business Intelligence Software market can be so confusing for buyers, I’ve attempted to position where Actionable Insights and Operational Analytics fit in the Business Intelligence ‘story’.
ATAAS2016 - Big Data Analytics – Data Visualization by Himanshu and Santosh (Agile Testing Alliance)
Big Data Analytics – Data Visualization. The industry faces significant challenges in processing and visualizing large amounts of data. Stakeholders nonetheless believe that real-time data visualization is important for making data-driven decisions, helping decision makers take important business decisions. In this session, I will give an overview of reporting, data visualization, and tools, helping participants make smart decisions. As part of the session, I will demo an automation process that uses tools, internet technology, web services, and database engines to visualize aggregated, meaningful data.
Qonnections2015 - Why Qlik is better with Big Data (John Park)
Big Data is everywhere. We generate enough data to track every single transaction, sensor reading, and interaction. But what do we do with it? Using JethroData and Qlik, you can easily unlock the value of Big Data. Qlik can help you find outliers and trends in billions of rows and make what seems like a long, esoteric process easy. The presentation will walk through the problem statement and the Qlik approach to solving the problem. We will also introduce JethroData, a new Qlik partner.
Advanced Analytics and Machine Learning with Data Virtualization (Denodo)
Watch: https://bit.ly/2DYsUhD
Advanced data science techniques, like machine learning, have proven to be extremely useful tools for deriving valuable insights from existing data. Platforms like Spark, and rich libraries for R, Python, and Scala, put advanced techniques at the fingertips of data scientists. However, data scientists spend most of their time looking for the right data and massaging it into a usable format. Data virtualization offers a new alternative to address these issues in a more efficient and agile way.
Attend this webinar and learn:
- How data virtualization can accelerate data acquisition and massaging, providing the data scientist with a powerful tool to complement their practice
- How popular tools from the data science ecosystem (Spark, Python, Zeppelin, Jupyter, etc.) integrate with Denodo
- How you can use the Denodo Platform with large data volumes in an efficient way
- How Prologis accelerated their use of Machine Learning with data virtualization
Joerg Bienert, CTO of ParStream, gave a presentation on February 25, 2014 about Big Data for business users. He talked about several use cases of current ParStream customers and ParStream’s technology itself.
Big Data Analytics with Hadoop: Customer Stories (Yellowfin)
Why watch?
Looking to analyze your growing data assets to unlock real business benefits today? But are you sick of all the Big Data hype and hoopla?
Watch this on-demand Webinar from Actian and Yellowfin – Big Data Analytics with Hadoop – to discover how we’re making Big Data Analytics fast and easy:
Learn how a telecommunications provider has already transformed its business using Big Data Analytics with Hadoop.
Hold on as we go from data in Hadoop to predictive analytics in just 40 minutes.
Learn how to combine Hadoop with the most advanced Big Data technologies, and the world’s easiest BI solution, to quickly generate real business value from Big Data Analytics.
What will you learn?
Discover how Actian’s market-leading Big Data Analytics technologies, combined with Yellowfin’s consumer-oriented platform for reporting and analytics, make generating value from Big Data Analytics faster and easier than you thought possible.
Join us as we demonstrate how to:
• Connect to, prepare and optimize Big Data in Hadoop for reporting and analytics.
• Perform predictive analytics on streaming Big Data: Learn how to empower all your analytics stakeholders to move from historical reports to predictive analytics and gain a sustainable competitive advantage.
• Communicate insights attained from Big Data: Optimize the value of your Big Data insights by learning how to effectively communicate analytical information to defined user groups and types.
This Webinar is ideal if…
• You want to act on more data and data types in shorter timeframes
• You want to understand the steps involved in achieving Big Data success – both front and back end
• You want to see how market leaders are leveraging Big Data to become data-driven organizations today
Looking to analyze and exploit Big Data assets stored in Hadoop? Then this Webinar is a must.
Freddie Mac makes homeownership and rental housing more accessible and affordable. Operating in the secondary mortgage market, we keep mortgage capital flowing by purchasing mortgage loans from lenders so they in turn can provide more loans to qualified borrowers. Our mission to provide liquidity, stability, and affordability to the U.S. housing market in all economic conditions extends to all communities from coast to coast.
We're using big data and advanced analytics to create powerful enhancements to better meet our customers’ needs: automated collateral evaluation, automated assessments for borrowers without credit scores, immediate certainty for collateral rep and warranty relief, and, coming soon, automated asset and income validation.
We’re building tools to help our customers cut costs and give them rep and warranty relief sooner in the loan manufacturing process.
We’ve designed Loan Advisor Suite with lenders to give our customers greater certainty, usability, reliability and efficiency. It's a simpler, better way to do business.
More Tools - Access powerful solutions for every stage of the loan production process.
More Loans - Increase output with automated data management and user-friendly controls.
Less Risk - Get alerted to loan issues and take action the moment they occur.
Hear the story of how ACE helped Freddie Mac reimagine the mortgage process and how HDP helped make it possible.
Speaker
Dennis Tally, Freddie Mac, Director
Logitech Accelerates Cloud Analytics Using Data Virtualization by Avinash Des... (Data Con LA)
Abstract: Many firms are adopting a cloud-first strategy and are migrating their on-premises technologies to the cloud. Logitech is one of them. We have adopted the AWS platform and big data in the cloud for all of our analytical needs, including Amazon Redshift and S3. In this presentation, I will cover: the business rationale for migrating to the cloud, how data virtualization enables the migration, and running data virtualization itself in the cloud.
Watch here: https://bit.ly/2D1fqB6
Today’s evolving data landscape has spawned new business challenges that require innovative solutions. These challenges include:
- Strategic decision-making, which relies on multiple perspectives such as social and economic factors that require combining internal and external data.
- Accounting for the increased volume and structural complexity of today’s data, and increased frequency required in delivering data assets.
- Coping with data silos that house data that must be combined and provisioned to support decision-making.
- Exposing purpose-built analytics, such as supply chain, for consumption in order to expedite decision-making.
Attend this session to learn how Data as a Service, fueled by data virtualization, overcomes these common challenges from the three dimensions of:
- Provisioning information-rich external data assets,
- Connecting data silos, and
- Enabling pre-built and packaged analytics.
How to Swiftly Operationalize the Data Lake for Advanced Analytics Using a Lo... (Denodo)
Watch full webinar here: https://bit.ly/3mfFJqb
Presented at Chief Data Officer Live Series 2021, ASEAN (August Edition)
While big data initiatives have become necessary for any business to generate actionable insights, a big data fabric has become a necessity for any successful big data initiative. A best-of-breed big data fabric should deliver actionable insights to business users with minimal effort, provide end-to-end security for the entire enterprise data platform, and provide real-time data integration while delivering a self-service data platform to business users.
Watch this on-demand session to learn how big data fabric enabled by Data Virtualization:
- Provides lightning fast self-service data access to business users
- Centralizes data security, governance, and data privacy
- Fulfills the promise of data lakes to provide actionable insights
Big Data Explained - Case study: Website Analytics (deep.bi)
This is an example case study showing what big data can mean for a small website that generates just 5,000 visits a day.
It all depends on what we want to get from our assets, such as website traffic. If we only measure the number of people who visited our site, then we do not need to worry about “big data”. We just have to count total visits (5,000 a day, 150,000 monthly).
But with just that simple measure, we know nothing about our visitors and customers, so it is pretty useless.
On the following slides we present what a website owner can gain from advanced website analytics and why big data technologies are recommended.
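The gap between the simple metric and real visitor insight can be shown in miniature: total visits is a single counter, but any per-visitor question requires keeping event-level data. The events below are invented for illustration:

```python
# Total visits is one number; per-visitor analytics (unique visitors,
# pages per visitor) require retaining the raw event stream - which is
# where "big data" tooling starts to matter as traffic grows.

events = [
    {"visitor": "v1", "page": "/home"},
    {"visitor": "v1", "page": "/pricing"},
    {"visitor": "v2", "page": "/home"},
    {"visitor": "v3", "page": "/blog"},
    {"visitor": "v1", "page": "/signup"},
]

total_visits = len(events)                             # the simple metric
unique_visitors = len({e["visitor"] for e in events})  # needs raw events
pages_per_visitor = total_visits / unique_visitors

print(total_visits, unique_visitors)  # 5 3
```

At 5,000 visits a day the raw event log is already 150,000 rows a month, and every new question (paths, segments, repeat behavior) is another scan over it.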
apidays LIVE Hong Kong - The Future of Legacy - How to leverage legacy and on... (apidays)
apidays LIVE Hong Kong - The Open API Economy: Finance-as-a-Service & API Ecosystems
The Future of Legacy - How to leverage legacy and on-prem assets in your digital transformation with Digital-Driven Integration
Zeev Avidan, Chief Product Officer of OpenLegacy
Build it…will they come by Shawn TrainerData Con LA
Abstract:- The truth about enabling self-service (and why you need it) Data is growing astronomically, historically and in real-time. So is the need for exploration and discovery. One size doesn’t fit all. We’ll be covering how to efficiently deliver information on-demand and promote self-service adoption with the right data platform.
AI Data Acquisition and Governance: Considerations for SuccessDatabricks
data pipeline, governance, and for growth and updating models regularly needs to be part of the AI strategy from the outset.
This session will cover:
Defining AI governance: What this means and how definitions of subjects like ethics and effectiveness can differ between organizations.
Data governance: Companies must rely on an AI governance program to ensure only high-quality, unbiased and consistent data are used in training.
AI is a growing necessity for enterprises / businesses; it provides an avenue for scaling quickly and efficiently.
Best practices / implementation: how to implement AI that meets the requirements of the organization’s defined sets of governances.
Planning the data pipeline and growing/updating the models: AI is not static in the real world; models must be frequently updated to maintain relevance and accuracy.
3 key takeaways or attendee benefits of the session:
Understand how to assess your organization’s need for AI; how to identify the opportune areas for transforming processes, interactions, scaling, cost.
How to start the implementation process. Defining data and AI governance and how to build the training data pipeline within that framework.
Best practices for maintaining AI; how to use data to evaluate models and continuously iterate on them to reflect the real world.
Fighting financial fraud at Danske Bank with artificial intelligenceRon Bodkin
Danske Bank, the leader in mobile payments in Denmark, is innovating with AI. Danske Bank’s existing fraud detection engine is being enhanced with deep learning algorithms that can analyze potentially tens of thousands of latent features. Danske Bank’s current system is largely based on handcrafted rules created by the business, based on intuition and some light analysis. The system is effective at blocking fraud, but it has a high rate of false positives, which is expensive and inconvenient, and it has proved impractical to update and maintain as fraudsters evolve their capabilities. Moreover, the bank understands that fraud is getting worse in the near- and long-term future due to the increased digitization of banking and the prevalence of mobile banking applications and recognizes the need to use cutting-edge techniques to engage fraudsters not where they are today but where they will be tomorrow.
Application fraud is an important emerging trend, in which machines fill in transaction forms. There is evidence that criminals are employing sophisticated machine-learning techniques to attack, so it’s critical to use sophisticated machine learning to catch fraud in banking and mobile payment transactions.
Ron Bodkin and Nadeem Gulzar explore how Danske Bank uses deep learning for better fraud detection. Danske Bank’s multistep program first productionizes “classic” machine learning techniques (boosted decision trees) while in parallel developing deep learning models with TensorFlow as a “challenger” to test. The system was first tested in shadow production and then in full production in a champion-challenger setup against live transactions. Ron and Nadeem explain how the bank is integrating the models with the efforts already running, giving the bank and its investigation team the ability to adapt to new patterns faster than before and taking on complex highly varying functions not present in the training examples.
What are actionable insights? (Introduction to Operational Analytics Software)Newton Day Uploads
What Are Actionable Insights? In this presentation I outline what Actionable Insights are and the Operational Analytics Software that can produce them. And because Business Intelligence and the Business Intelligence Software market can be so confusing for buyers I've attempted to position where Actionable Insights and Operational Analytics fit in the Business Intelligence 'story'.
ATAAS2016 - Big data analytics – data visualization himanshu and santoshAgile Testing Alliance
Big Data Analytics – Data Visualization There are a significant amount of challenges in the industry to process and visualize large amount of data. Interestingly, stakeholders still believe a lot on data visualization on real-time basis is quite important to take decisions which are data driven and help decision makers taking important business decisions. Through the session, I will cover the overview of reporting, data visualization, tools and helping participants to take smart decisions. As part of the session, I will demo the automation process using tools, internet technology, web services and database engines to visualize aggregated and meaningful decisions
Qonnections2015 - Why Qlik is better with Big DataJohn Park
Big Data is everywhere. We generate enough data to track every single transactions, sensors, and interactions. But what do we do with this? Using Jethro Data and Qlik you can easily unlock the value of Big Data. Qlik can help you find outliers and trends in Billions rows and make what seems long esoteric process easy. The presentation will go through the problem statement and Qlik like approach to solving the problem. We will also introduce Jethro data a new Qlik Partner to the Our Partners
Advanced Analytics and Machine Learning with Data VirtualizationDenodo
Watch: https://bit.ly/2DYsUhD
Advanced data science techniques, like machine learning, have proven an extremely useful tool to derive valuable insights from existing data. Platforms like Spark, and complex libraries for R, Python and Scala put advanced techniques at the fingertips of the data scientists. However, these data scientists spent most of their time looking for the right data and massaging it into a usable format. Data virtualization offers a new alternative to address these issues in a more efficient and agile way.
Attend this webinar and learn:
- How data virtualization can accelerate data acquisition and massaging, providing the data scientist with a powerful tool to complement their practice
- How popular tools from the data science ecosystem: Spark, Python, Zeppelin, Jupyter, etc. integrate with Denodo
- How you can use the Denodo Platform with large data volumes in an efficient way
- How Prologis accelerated their use of Machine Learning with data virtualization
CTO of ParStream Joerg Bienert hold a presentation on February 25, 2014 about Big Data for Business Users. He talked about several use cases of current ParStream customers and ParStreams' technology itself.
Big Data Analytic with Hadoop: Customer StoriesYellowfin
Why watch?
Looking to analyze your growing data assets to unlock real business benefits today? But, are you sick of all the Big Data hype and whoopla?
Watch this on-demand Webinar from Actian and Yellowfin – Big Data Analytics with Hadoop – to discover how we’re making Big Data Analytics fast and easy:
Learn how a telecommunications provider has already transformed its business using Big Data Analytics with Hadoop.
Hold on as we go from data in Hadoop to predictive analytics in just 40-minutes.
Learn how to combine Hadoop with the most advanced Big Data technologies, and world’s easiest BI solution, to quickly generate real business value from Big Data Analytics.
What will you learn?
Discover how Actian’s market-leading Big Data Analytics technologies, combined with Yellowfin’s consumer-oriented platform for reporting and analytics, makes generating value from Big Data Analytics faster and easier than you thought possible.
Join us as we demonstrate how to:
• Connect to, prepare and optimize Big Data in Hadoop for reporting and analytics.
• Perform predictive analytics on streaming Big Data: Learn how to empower all your analytics stakeholders to move from historical reports to predictive analytics and gain a sustainable competitive advantage.
• Communicate insights attained from Big Data: Optimize the value of your Big Data insights by learning how to effectively communicate analytical information to defined user groups and types.
This Webinar is ideal if…
• You want to act on more data and data types in shorter timeframes
• You want to understand the steps involved in achieving Big Data success – both front and back end
• You want to see how market leaders are leveraging Big Data to become data-driven organizations today
Looking to analyze and exploit Big Data assets stored in Hadoop? Then this Webinar is a must.
Freddie Mac makes homeownership and rental housing more accessible and affordable. Operating in the secondary mortgage market, we keep mortgage capital flowing by purchasing mortgage loans from lenders so they in turn can provide more loans to qualified borrowers. Our mission to provide liquidity, stability, and affordability to the U.S. housing market in all economic conditions extends to all communities from coast to coast.
We're using big data and advanced analytics to create powerful enhancements to better meet our customer’s needs: automated collateral evaluation, automated assessments for borrowers without credit scores, immediate certainty for collateral rep and warranty relief, and coming soon automated asset and income validation.
We’re building tools to help our customers cut costs and give them rep and warranty relief sooner in the loan manufacturing process.
We’ve designed Loan Advisor Suite with lenders to give our customers greater certainty, usability, reliability and efficiency. It's a simpler, better way to do business.
More Tools - Access powerful solutions for every stage of the loan production process.
More Loans - Increase output with automated data management and user-friendly controls.
Less Risk = Get alerted to loan issues and take action the moment they occur.
Hear the story of how ACE helped Freddie Mac reimagine the mortgage process and how HDP helped make it possible.
Speaker
Dennis Tally, Freddie Mac, Director
Logitech Accelerates Cloud Analytics Using Data Virtualization by Avinash Des...Data Con LA
Abstract:- Many firms are adopting a cloud first strategy and are migrating their on-premises technologies to the cloud. Logitech is one of them. We have adopted the AWS platform and big data on the cloud for all of their analytical needs, including Amazon Redshift and S3. In this presentation, I will present: The business rationale for migrating to the cloud. How data virtualization enables the migration. Running data virtualization itself in the cloud.
Watch here: https://bit.ly/2D1fqB6
Today’s evolving data landscape has spawned new business challenges that require innovative solutions. These challenges include:
- Strategic decision-making, which relies on multiple perspectives such as social and economic factors that require combining internal and external data.
- Accounting for the increased volume and structural complexity of today’s data, and the increased frequency at which data assets must be delivered.
- Coping with data silos that house data that must be combined and provisioned to support decision-making.
- Exposing purpose-built analytics, such as supply chain, for consumption in order to expedite decision-making.
Attend this session to learn how Data as a Service, fueled by data virtualization, overcomes these common challenges from the three dimensions of:
- Provisioning information-rich external data assets,
- Connecting data silos, and
- Enabling pre-built and packaged analytics.
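The "connecting data silos" dimension can be made concrete with a toy sketch: a virtual view that joins an internal silo with external data at query time, without copying either source. This is plain Python for illustration only, not Denodo's API; the data and field names are invented.

```python
# Illustrative sketch of a "virtual view" that federates two data silos
# at query time instead of physically copying data into one store.

# Silo 1: internal CRM records
crm = [
    {"customer_id": 1, "name": "Acme Corp", "segment": "enterprise"},
    {"customer_id": 2, "name": "Globex", "segment": "smb"},
]

# Silo 2: external economic indicators keyed by segment
external = {"enterprise": {"growth_index": 1.8}, "smb": {"growth_index": 1.1}}

def virtual_view():
    """Join internal and external data on demand, leaving both sources in place."""
    for row in crm:
        enriched = dict(row)                          # copy the internal record
        enriched.update(external.get(row["segment"], {}))  # enrich at query time
        yield enriched

for record in virtual_view():
    print(record)
```

The point of the sketch is that consumers see one information-rich view while both silos stay where they are, which is the essence of the data-virtualization approach described above.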
How to Swiftly Operationalize the Data Lake for Advanced Analytics Using a Lo... - Denodo
Watch full webinar here: https://bit.ly/3mfFJqb
Presented at Chief Data Officer Live Series 2021, ASEAN (August Edition)
While big data initiatives have become necessary for any business to generate actionable insights, big data fabric has become a necessity for any successful big data initiative. The best-of-breed big data fabrics should deliver actionable insights to the business users with minimal effort, provide end-to-end security to the entire enterprise data platform, and provide real-time data integration while delivering a self-service data platform to business users.
Watch this on-demand session to learn how big data fabric enabled by Data Virtualization:
- Provides lightning fast self-service data access to business users
- Centralizes data security, governance, and data privacy
- Fulfills the promise of data lakes to provide actionable insights
Big Data Explained - Case study: Website Analytics - deep.bi
This is an example case study showing what big data can mean for a small website that generates just 5000 visits a day.
It all depends on what we want to get from our assets, like website traffic. If we only measure the number of people who visited our site, then we do not need to worry about “big data”. We just have to count total visits (5,000 a day, 150,000 monthly).
But this simple measure alone tells us nothing about our visitors and customers, so on its own it is pretty useless.
On the following slides we present what a website owner can gain from advanced website analytics and why big data technologies are recommended.
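The contrast between the single aggregate and event-level data can be sketched briefly (illustrative Python; the events are invented):

```python
# With only a counter we can answer one question; with raw event-level
# data (what "big data" pipelines retain) we can answer many more:
# unique visitors, visits per visitor, top pages, and so on.

daily_visits = 5000                     # the single aggregate measure
monthly_visits = daily_visits * 30      # ~150,000 visits per month

# Event-level records, one per page view (toy data):
events = [
    {"visitor": "a", "page": "/home"},
    {"visitor": "a", "page": "/pricing"},
    {"visitor": "b", "page": "/home"},
]

unique_visitors = len({e["visitor"] for e in events})
visits_per_visitor = len(events) / unique_visitors
print(monthly_visits, unique_visitors, visits_per_visitor)
```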
apidays LIVE Hong Kong - The Future of Legacy - How to leverage legacy and on... - apidays
apidays LIVE Hong Kong - The Open API Economy: Finance-as-a-Service & API Ecosystems
The Future of Legacy - How to leverage legacy and on-prem assets in your digital transformation with Digital-Driven Integration
Zeev Avidan, Chief Product Officer of OpenLegacy
Neev Information Technologies has expertise in Web, Mobile, and Social applications. It has launched cloud services in collaboration with Amazon. The company is growing at more than 50% YoY and has a great future ahead.
Apidays Paris 2023 - Building APIs At Scale, Ado Trakic, Capital One - apidays
Apidays Paris 2023 - Software and APIs for Smart, Sustainable and Sovereign Societies
December 6, 7 & 8, 2023
Building APIs At Scale: Delivering Products Faster
Ado Trakic, Enterprise Architect - API CoE at Capital One
------
Check out our conferences at https://www.apidays.global/
Do you want to sponsor or talk at one of our conferences?
https://apidays.typeform.com/to/ILJeAaV8
Learn more on APIscene, the global media made by the community for the community:
https://www.apiscene.io
Explore the API ecosystem with the API Landscape:
https://apilandscape.apiscene.io/
Startup pitch presented by co-founder and CEO Jaco Els. Cubitic offers a predictive analytics platform that allows developers to build custom solutions for analytics and visualisation on top of a machine learning engine.
Asyma E3 2014 The Impact of Cloud Computing on SMEs - asyma
Why do you use the “CLOUD”? Do you want to? Do you have a choice? What does it cost to participate? What impact does Cloud Computing have on my business and why should I care? We’ll look at the cloud from both sides now.
Explore our analysis of technology trends for 2019 and beyond: AI, IoT, Security, Big Data / Data Science, Mobile Apps Development, AR/VR, RPA (Robot Process Automation), Blockchain, Automotive Solutions, Business Intelligence, Cloud Computing, Service Desk, Autonomous Things, Augmented Analytics, AI-Driven Development, Digital Twins, Empowered Edge, Immersive Experience, Smart Spaces, Quantum Computing, and more.
Check our recommendations for businesses to stay current with the latest IT tendencies.
Includes a video by Gartner.
What's coming: How will you write your future? - Laura Voglino - GeneXus
In today's world, marked by technology shifts that coexist at the same time - data, the cloud, and mobility - how can we plan to build sustainable differentiation that offers lasting value over time? What will we make of this moment as companies, as individuals, as communities?
• Exploit data to redefine your position in the industry
• Capitalize on the Cloud to reinvent your business model
• Engage with your ecosystem - partners, developers, and employees - to achieve agile innovation.
For many, web-scale IT is an alien and drastic approach being met with fear and resistance. So the first question for any organization should be: what is it? Cameron Haight, Gartner’s chief of research for infrastructure and operations, coined the term “Web-scale IT” in early 2014 as a way to describe the new ways organizations leverage technology to provide their customers with content quickly and at massive scale.
Check the infographic on our blog: http://www.euroitgroup.com/telecom-2020-infographic/
The telecom industry will continue to play a central role in addressing a whole range of social and economic challenges. “By working together, the mobile industry is truly connecting everyone and everything to a better future.” The role of mobile technology in improving tomorrow’s society must be at the forefront of everything we do.
Digital Architecture – The Missing Link in Digital Transformation Success - NUS-ISS
Today, every business is a digital business, and Digital Architecture is the critical enabler of a successful Digital Strategy and Transformation. Without it, most organisations' Digital Transformation efforts will fail or will not deliver the full benefits of their Digital Strategy. In this session, we will distil the essence of the Digital Architecture best practices of successful organisations into a set of principles, a framework, and a toolkit that participants can apply to their own organisations to drive successful Digital Transformation.
Transforming Software Architecture for the 21st Century (September 2009) - Dion Hinchcliffe
Evolving an important theme I've been working on and presenting all year, this new deck summarizes how enterprise architecture and large scale technology-based business solutions must transform to be more effective in the 21st century.
Contains material on a hypothesis for what's wrong with today's EA as well as potential solutions of merit such as emergent architecture, WOA, enterprise REST, open supply chains (APIs), mashups, and other models.
Presented this week in Oslo Norway to Bouvet's enterprise architecture council.
Accelerate Enterprise Software Engineering with Platformless - WSO2
Key takeaways:
Challenges of building platforms and the benefits of platformless.
Key principles of platformless, including API-first, cloud-native middleware, platform engineering, and developer experience.
How Choreo enables the platformless experience.
How key concepts like application architecture, domain-driven design, zero trust, and cell-based architecture are inherently a part of Choreo.
Demo of an end-to-end app built and deployed on Choreo.
Less Is More: Utilizing Ballerina to Architect a Cloud Data Platform - WSO2
At its core, the challenge of managing Human Resources data is an integration challenge: estimates range from 2-3 HR systems in use at a typical SMB, up to a few dozen systems implemented amongst enterprise HR departments, and these systems seldom integrate seamlessly between themselves. Providing a multi-tenant, cloud-native solution to integrate these hundreds of HR-related systems, normalize their disparate data models and then render that consolidated information for stakeholder decision making has been a substantial undertaking, but one significantly eased by leveraging Ballerina. In this session, we’ll cover:
The overall software architecture for VHR’s Cloud Data Platform
Critical decision points leading to adoption of Ballerina for the CDP
Ballerina’s role in multiple evolutionary steps to the current architecture
Roadmap for the CDP architecture and plans for Ballerina
WSO2’s partnership in bringing continual success for the CDP
The integration landscape is changing rapidly with the introduction of technologies like GraphQL, gRPC, stream processing, iPaaS, and platformless. However, not all existing applications and industries can keep up with these new technologies. Certain industries, like manufacturing, logistics, and finance, still rely on well-established EDI-based message formats. Some applications use XML or CSV with file-based communications, while others have strict on premises deployment requirements. This talk focuses on how Ballerina's built-in integration capabilities can bridge the gap between "old" and "new" technologies, modernizing enterprise applications without disrupting business operations.
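As a toy illustration of that old-to-new bridging (written in Python for brevity rather than Ballerina, with hypothetical field names), a legacy CSV payload can be normalized into the JSON shape a modern service expects:

```python
# Sketch: accept a legacy CSV message and normalize it into JSON-ready
# records, the kind of "old to new" bridge described in the talk.
import csv, io, json

legacy_csv = "order_id,qty,sku\n1001,3,AB-7\n1002,1,XY-2\n"

def csv_to_orders(text):
    """Parse a legacy CSV payload into normalized JSON-ready records."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        {"id": int(row["order_id"]), "quantity": int(row["qty"]), "sku": row["sku"]}
        for row in reader
    ]

print(json.dumps(csv_to_orders(legacy_csv)))
```

In Ballerina itself such transformations are built into the language's record and data-mapping features; the sketch only shows the shape of the problem.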
Platformless Horizons for Digital Adaptability - WSO2
In this keynote, Asanka Abeysinghe, CTO,WSO2 will explore the shift towards platformless technology ecosystems and their importance in driving digital adaptability and innovation. We will discuss strategies for leveraging decentralized architectures and integrating diverse technologies, with a focus on building resilient, flexible, and future-ready IT infrastructures. We will also highlight WSO2's roadmap, emphasizing our commitment to supporting this transformative journey with our evolving product suite.
Quantum computers are rapidly evolving and are promising significant advantages in domains like machine learning or optimization, to name but a few areas. In this keynote we sketch the underpinnings of quantum computing, show some of the inherent advantages, highlight some application areas, and show how quantum applications are built.
DevOps and Testing slides at DASA Connect - Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, what agile testing is, and finally what testing in DevOps looks like. We also held a lovely workshop with the participants, exploring different ways to think about quality and testing in the various parts of the DevOps infinity loop.
JMeter webinar - integration with InfluxDB and Grafana - RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
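Under the hood, a JMeter backend listener ships each sample to InfluxDB as line-protocol records, which Grafana then charts. The sketch below formats one such record; the measurement, tag, and field names are illustrative rather than JMeter's exact schema.

```python
# Sketch of the InfluxDB line protocol a load-testing backend listener
# might emit per sample: measurement,tags fields timestamp
def to_line_protocol(label, elapsed_ms, success, timestamp_ns):
    """Format one JMeter-style sample as an InfluxDB line-protocol record."""
    tags = f"transaction={label},status={'ok' if success else 'ko'}"
    fields = f"elapsed={elapsed_ms}i"          # 'i' marks an integer field
    return f"jmeter,{tags} {fields} {timestamp_ns}"

line = to_line_protocol("login", 142, True, 1700000000000000000)
print(line)
```

A real setup would POST such lines to InfluxDB's write endpoint and point a Grafana dashboard at the resulting time series.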
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... - Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas often struggle to keep up with the competition. However, fostering a culture of innovation takes real work. It takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf - 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Connector Corner: Automate dynamic content and events by pushing a button - DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
Create a campaign using Mailchimp with merge tags/fields
Send an interactive Slack channel message (using buttons)
Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
Your campaign sent to target colleagues for approval
If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti... - Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Neuro-symbolic is not enough, we need neuro-*semantic* - Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply doing machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These gains will only be realized when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
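The idea of semantics as "predictable inference" can be sketched with a toy knowledge graph: given a semantic rule, here transitivity of a part_of relation, the predicted links follow deterministically from the rule. This is illustrative Python with invented triples, not the talk's actual system.

```python
# Link prediction by predictable inference: close a toy knowledge graph
# under the transitivity rule for one relation and return the new links.
triples = {
    ("wheel", "part_of", "car"),
    ("car", "part_of", "fleet"),
}

def infer_transitive(triples, relation="part_of"):
    """Return links inferred by closing the graph under transitivity."""
    inferred = set(triples)
    changed = True
    while changed:                      # iterate until no new links appear
        changed = False
        for (a, r1, b) in list(inferred):
            for (c, r2, d) in list(inferred):
                if r1 == r2 == relation and b == c and (a, relation, d) not in inferred:
                    inferred.add((a, relation, d))
                    changed = True
    return inferred - triples           # only the newly predicted links

print(infer_transitive(triples))
```

Because the rule carries the semantics, every predicted link is explainable, which is the contrast the talk draws with learning over arbitrary symbolic structures.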
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... - James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. The constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
GraphRAG is All You need? LLM & Knowledge Graph - Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
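The core GraphRAG loop can be sketched as: retrieve a local subgraph around the query entities, then use those facts as grounding context in the prompt. This is illustrative Python with a toy graph and an invented prompt format, not Microsoft's implementation.

```python
# Toy knowledge graph: entity -> list of (relation, object) edges.
graph = {
    "FalkorDB": [("founded_by", "Guy Korland"), ("category", "graph database")],
    "Guy Korland": [("role", "CEO")],
}

def subgraph_context(entity, depth=1):
    """Collect facts reachable from an entity up to a given depth."""
    facts, frontier = [], [entity]
    for _ in range(depth):
        next_frontier = []
        for node in frontier:
            for rel, obj in graph.get(node, []):
                facts.append(f"{node} {rel} {obj}")
                next_frontier.append(obj)
        frontier = next_frontier
    return facts

# Ground an (imagined) LLM call in the retrieved subgraph:
context = subgraph_context("FalkorDB")
prompt = "Answer using only these facts:\n" + "\n".join(context)
print(prompt)
```

A production system would add entity linking, community summaries, and an actual LLM call; the sketch only shows how the graph supplies the grounding context.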
The Art of the Pitch: WordPress Relationships and Sales - Laura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Essentials of Automations: Optimizing FME Workflows with Parameters - Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
2. Session Agenda - Integration
o The case for digital transformation
o Digital transformation – five technology trends
o A technology platform for digital transformation
o Ten steps to getting there – platform of platforms
o Conclusion
4. LEGO: Building customer experience, brick by brick
• Digital age PR
• Games and Movies
• Free games
• Lego movie
• Crowdsourcing and customer value
• Ideas.lego.com
• Youtube content
• Cartoons
• Micro-sites
• Co-branding
5. The democratization of technology (driven by its plummeting cost), increased access to funds and a rising entrepreneurial culture means that there are now hundreds of startups attacking traditional markets. Uber, Twitch, Tesla, Hired, Clinkle, Beyond Verbal, Vayable, GitHub, WhatsApp, Airbnb, Matternet, Snapchat, Homejoy, Waze and the list goes on. These startups are achieving scale far quicker than analog companies ever did. Whereas the average Fortune 500 company took 20 years to reach a market capitalization of $1 billion, Google managed it in eight years, and the likes of Uber, Snapchat and Xiaomi in three years or less. (Source: Accenture/WEF 2016)
6. Digital transformation – Five technology trends
• Platform Revolution
• Customer centric Business
• Rapid Innovation
• Big data insight as an Asset
• Workforce reimagined
8. Platform (R)evolution – from Competition to Coopetition
Business focus:
• Platform business model
• Network effect
• Blurred borders between partners and competitors
• Combined value
• Other examples: Uber, AirBnB
Technology focus:
• Technology Platform to build a Platform Business
• Integration platform used to connect stakeholders
10. Customer centric business – the Internet of Me!
Business focus:
• Customers as the core driver
• Social media and networks
• Personalisation of apps and services
• Sensors, wearables and IoT providing a constant stream of data
Technology focus:
• Internet of Things combined with customer experience
11. Big data insight as an asset
Using WSO2 Analytics, Experian created systems that could take data from marketing channels in real time, process and react with the required information under the average human reaction time of 200 milliseconds, correlate against historical and predictive data, and respond to the client – within 3-5 milliseconds!
http://wso2.com/blogs/thesource/2015/10/wso2con-insights-experian-uses-wso2-to-uncover-credit-intelligence/
12. Big data insight as an asset
Business focus:
• Data everywhere
• Intelligent enterprise
Technology focus:
• Big data analytics, including:
• Batch analytics
• Streaming fast data analytics
• Predictive analytics and machine learning
• Interactive analytics
• Geospatial analytics
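The Experian pattern described above, reacting to each marketing-channel event within a few milliseconds by correlating it against precomputed historical data, can be sketched as a simple handler. This is illustrative Python, not WSO2's implementation; the field names and offer logic are invented.

```python
# Sketch: enrich each streaming event against a precomputed historical
# lookup and respond immediately, tracking per-event handling latency.
import time

historical = {"cust-1": {"avg_spend": 120.0}}   # precomputed history

def handle_event(event):
    """Correlate one streaming event with history and respond at once."""
    start = time.perf_counter()
    profile = historical.get(event["customer"], {})
    response = {
        "customer": event["customer"],
        "offer": "premium" if profile.get("avg_spend", 0) > 100 else "standard",
    }
    response["latency_ms"] = (time.perf_counter() - start) * 1000
    return response

print(handle_event({"customer": "cust-1", "channel": "web"}))
```

The low latencies quoted on the slide come from exactly this shape of design: the expensive historical and predictive computation is done ahead of time, so the per-event path is a fast lookup plus a decision.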
14. The workforce, reimagined
Business focus:
• Cross team collaboration
• Creativity as a core value
• Agile development
• Meritocracy model
• Micro services teams
• The Connected Enterprise
Technology focus:
• Open Source – internal and external
• Iterative architecture
• Integration platform used to connect stakeholders
15. Building a platform for transformation
Legacy Systems, Cloud Services, Data Sources
Stakeholders, Partners, Applications, Devices, Systems
On-premise, Cloud, Managed Cloud, Hybrid Deployments (Bare-metal, VM, Containers, IaaS)
1. Customers drive business
2. Devices and IoT are stakeholders
3. Partners contribute to platform business model
4. Rapid deployments on hybrid IaaS, PaaS environments
5. Containerization for web scale deployment
6. Integration, Services, Micro-integration and micro-services enable rapid innovation
7. Security is not an afterthought!
8. Managed APIs encourage an API economy
9. Consumerization of IT is key for customer centric business
10. Analytics drives business functions
Integration: Connectors, Data Services, Workflows, Services, Orchestration, Transformation, Micro Services, Macro Services, Messaging, Security
API Management
App Dev and Management
Mobile and IoT: Asset Store (API, Apps, Services etc.), Mobile Device Management, Mobile Application Management, IoT Device Management
Governance
Dashboards and Stores
Analytics: Batch, Streaming, Predictive
16. A Technology Platform for Digital Transformation
● 100% Open Source
● Fully integrated, complete middleware platform
● 26 products available, so that you deploy only what you need, when you need it
● Built from scratch, based on a single code-base/platform
● Works seamlessly across the cloud and on-premise