This document discusses artificial intelligence and machine learning. It begins with an outline covering the AI revolution, methods and protocols, and a call to action. It then discusses the spectacular acceleration of investment and performance in AI. Next, it provides examples of AI applications in various industries. It describes today's AI toolbox, including various machine learning techniques. It stresses the importance of data collection for AI strategies and offers recommendations for how organizations can take action and build on AI successes.
Information Systems for Digital Transformation - Yves Caseau
Keynote at "Complex Systems Design and Management"
Exponential Information Systems as the backbone of Digital Transformation. This talk addresses the goals and the challenges of transforming IS into platforms that support their company's digital ambitions.
"Designing Better Machines: Evolution of a cognitive Digital Twin"
Industry 4.0 Meets Industrial Internet of Things Forum at Hannover Messe 2018 with IBM Watson IoT CTO Sky Matthews @blueskyflash @IBMIoT #HM18 #IBM #WatsonIoT
Cognitive Digital Twin by Fariz Saračević - Bosnia Agile
Data is driving the world today and becoming the world's most precious currency. Continuous Engineering, the default set of applications for enterprise software development, produces a wealth of data, but it is hard to understand its value. What if you could find hidden patterns in the data your development teams create? What if you could discover ways to improve your team's performance? This presentation reviewed some of the different ways the Collaborative Lifecycle Management team (http://jazz.net) is utilizing Watson Analytics to gain insights into and improve efficiency with their own processes.
Public talk delivered to Bouygues Telecom corporate customers in 2013 - About Quantified self, e-health and well being, connected objects and ecosystems
A look at how cognitive computing is driving new productivity and gains in the manufacturing industry. To learn more: http://www.ibm.com/internet-of-things/iot-solutions/connected-manufacturing/
Big Data Expo 2015 - Cisco Connected Analytics - BigDataExpo
The presentation will describe the Internet of Everything technology transition, where people, process, data and things are coming together to unleash $14.4 trillion in global economic value.
The question is how we capture this value by connecting the unconnected, while carving out actionable, replicable insights from Big Data. The speech will include practical cases on how enterprises – including Cisco – and public sector agencies are able today to unleash economic, social and environmental value through data-intensive, new IT consumption models.
Over the past decade, cloud computing has acted as a disruptor in several areas of the IT business. Soon, it will overhaul one area of technology that has been in rapid growth itself: data analytics. Nicky will focus on a recent study by the IBM Institute for Business Value, which shows that capabilities that enable an organization to consume data faster – to move from raw data to insight-driven actions – are now the key differentiator in creating value from data and analytics. He will also talk about the requirements for the underlying infrastructure as a critical component allowing real-time crunching and analysis of high volumes of data. Based on real cases such as retailers and energy companies, we will look at five predictions for the next five years, based on:
Analytics, Big data, and Cloud coming together will energize the Speed Advantage.
Digital Twin - What is it and how can it help us? - Shaun West
RQ: What services can be provided (by whom and to whom) through (or by adopting, or developing) the digital twin concepts?
Our focus is long-life capital equipment
Consider the whole life cycle
Apply Service Dominant Logic in the assessment
Consider technical and business hierarchies
IoT Meets Big Data: The Opportunities and Challenges by Syed Hoda of ParStream - gogo6
Download our special report, IoT Tech for the Manager: http://bit.ly/report1-slideshare
IoT Meets Big Data: The Opportunities and Challenges as presented at the IoT Inc Business' Eighth Meetup. See: http://www.iot-inc.com/iot-meets-big-data-the-opportunities-and-challenges/
In our eighth Meetup we have Syed Hoda, Chief Marketing Officer of ParStream presenting “IoT Meets Big Data: The Opportunities and Challenges”. Come meet other business leaders in the IoT ecosystem and discuss the business issues you face in the Internet of Things.
Presentation Abstract
The Internet of Things (IoT) and Big Data have each made press headlines and continue to be board-level priorities. The intersection of IoT and Big Data is a fascinating area of innovation with tremendous scope for business impact. From industrial sensors to vehicles to health monitors, a huge variety of devices connects to the Internet and share information. At the same time, the cost to store data has dropped dramatically while capabilities for analysis have made huge leaps forward. How can analytics drive business benefits from IoT projects? What are the challenges in storing and analyzing huge amounts of real-world information? How can companies generate more value from their data? We will address these questions and also share our perspectives on innovative technologies enabling new IoT use cases.
The Analytics Value Chain - Key to Delivering Business Value in IoT - Peter Nguyen
In the IoT, new analytics approaches are needed to handle the time-critical nature of IoT challenges.
The first step in designing a new approach is to simplify the process by integrating all the data for an IoT application. That includes all the structured, unstructured, and semi-structured data in your organization.
The second key step in the streamlining process is to unify the analytics layer.
This includes historical analytics (descriptive & diagnostic), real-time streaming analytics, predictive analytics, and prescriptive analytics.
The approach to analytics outlined above is a good first step for IoT. However, it is the ability to execute analytics in real-time across the analytics value chain (streaming, historical, predictive, and prescriptive analytics) with relevant contextual and situational data that addresses the critical “last mile” for timely outcomes.
Then this must be combined with the ability to take the next best action in any particular scenario to create the greatest value.
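As a rough illustration of that value chain (all names, data and thresholds below are invented for the sketch, not taken from the presentation), a unified analytics layer might expose the four analytics styles over one integrated data set:

```python
from statistics import mean

# Toy unified analytics layer over one integrated IoT data set.
readings = [41.0, 42.5, 44.0, 47.5, 52.0]  # e.g. sensor temperature history

def descriptive(data):
    """Historical analytics: what happened."""
    return {"mean": mean(data), "max": max(data)}

def streaming(data, threshold=50.0):
    """Real-time analytics: flag the latest reading against a threshold."""
    return data[-1] > threshold

def predictive(data):
    """Naive linear extrapolation of the next reading."""
    return data[-1] + (data[-1] - data[-2])

def prescriptive(data, threshold=50.0):
    """Next best action, derived from the prediction."""
    return "schedule maintenance" if predictive(data) > threshold else "no action"

print(descriptive(readings))   # → {'mean': 45.4, 'max': 52.0}
print(streaming(readings))     # → True
print(prescriptive(readings))  # → schedule maintenance
```

The point of the sketch is the single entry point over one data set: each analytics style reads the same integrated history rather than its own silo.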
Presentation developed for IoT DevCon in April 2017, Santa Clara, California, USA.
In this deck, I go through the value of data and its derivatives (analytics) from an economic point of view, and then connect that to how we can package this value for monetization in the context of IoT. The key technology enabler is the Digital Twin, which is described along with the platforms that support this paradigm. It ends with recommendations on how to get ready and how to start using the GE Digital Predix platform.
The Fog or Edge Computing model complements Cloud Computing with small, typically sensor-enabled, IoT-connected devices that process distributed data at its source. As this model matures, we see the uptake of a 3-tier architecture with Intelligent Gateways that aggregate sensor input before communicating with data centers or a Cloud. Two forces will drive the practice of distributing Intelligence (Understanding/Reasoning/Learning) to the Gateway. The first is the presence of the Gateway itself, which enables a standards-based approach to distributing intelligence and moving it closer to the edge. The second is the trend toward simplifying system requirements by processing training data or model validation with big data prior to deployment, and using small-footprint devices for operational systems.
This webinar will present an overview of the relevant technologies and trends. Participants will learn about the state of the art today, and how to identify apps in their own environment that would be good candidates for Intelligent Edge solutions.
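The train-big/deploy-small pattern described above can be sketched in a few lines. The split of functions and the simple threshold model here are illustrative assumptions, not the webinar's actual stack:

```python
import json
from statistics import mean, stdev

def train_in_cloud(history):
    """Heavy step: summarize bulk training data into a tiny model."""
    return {"mean": mean(history), "stdev": stdev(history)}

def export_model(model):
    """Serialize the small-footprint model for shipping to the gateway."""
    return json.dumps(model)

def edge_infer(model_json, reading, k=3.0):
    """Light step on the gateway: flag readings k stdevs from the mean."""
    model = json.loads(model_json)
    return abs(reading - model["mean"]) > k * model["stdev"]

history = [20.0, 21.0, 19.5, 20.5, 20.0, 21.5]
packed = export_model(train_in_cloud(history))
print(edge_infer(packed, 20.7))  # → False (normal reading)
print(edge_infer(packed, 35.0))  # → True  (anomaly)
```

Only the two summary numbers cross the network; the gateway never needs the bulk history, which is the operational simplification the paragraph above describes.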
From Alexa and Siri to factory robots and financial chatbots, intelligent systems are reshaping industries. But the biggest changes are still to come, giving companies time to create winning AI strategies.
The technical coordinators’ overview of the project, which addresses background aspects of AI and the project's motivation for explainable AI. This presentation then goes a bit deeper into the project's concept and vision, explaining how business experts, data scientists and data engineers can interact with XMANAI, and provides a roadmap for the partners.
Ferma report: Artificial Intelligence applied to Risk Management - FERMA
FERMA brought together a group of experts from within and beyond the risk management community to develop the first thought paper about AI applied to risk management.
Their aim was, first, to perform an initial assessment of the potential value of AI in improving enterprise risk management (ERM) and, second, to understand how risk managers can be key actors in highlighting to organisation leadership the opportunities and challenges of AI technologies.
The working group expects that corporate risk management will benefit from AI in several areas. “From its ability to process large amounts of data to the automation of certain risk management repetitive and burdensome steps, AI could allow risk managers to respond faster to new and emerging exposures. By acting in real time and with some predictive capabilities, risk management could reach a new level in supporting better decision making for senior management.”
This paper aims to guide risk managers on applying AI from a basic understanding to developing their own strategy on the implementation of AI. It includes an action guide and a template for risk managers to develop their own AI risk management roadmap.
How AI and ML Can Optimize the Supply Chain - Global Sources
Artificial intelligence (AI) and machine learning (ML) were already buzzwords in the technology and manufacturing spheres before the pandemic upended the global supply chain. Ironically, with the disruption from the health crisis, the push toward translating them into reality has become stronger.
Although there is still a huge gap between “ambition and execution,” as industry analysts put it, the AI and ML promises of higher productivity and better resilience cannot be ignored. A few have started adopting the technologies and many more are expected to follow and reap the benefits of a highly integrated system in the coming years.
Global Sources‘ latest e-book, How Artificial Intelligence & Machine Learning Can Optimize the Supply Chain, explores the potential benefit of technology on key areas, such as data collection and analysis, supply chain optimization, cost reduction, forecasting and planning. It offers a roadmap to augmentation and automation, and how this will help speed up operations, boost efficiency and build resilience. The book also covers challenges posed by the adoption of artificial intelligence and machine learning in current setups, and how they can be overcome.
Read more about the advantages of adopting a highly integrated system using artificial intelligence and machine learning.
Download here to get a free copy of How Artificial Intelligence & Machine Learning Can Optimize the Supply Chain.
Semantic Artificial Intelligence is the fusion of various types of AI, including symbolic AI, reasoning, and machine learning techniques like deep learning. At the same time, Semantic AI has a strong focus on data management and data governance. This 'wedding' of AI techniques brings new promises, but also a stronger focus on fundamental approaches such as Explainable AI (XAI), knowledge graphs, and Linked Data.
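As a toy illustration of that fusion (the asset names and the scoring rule are invented for this sketch), a symbolic knowledge graph can supply explainable context while a learned model supplies a score:

```python
# Tiny knowledge graph as subject-predicate-object triples.
triples = {
    ("pump-7", "is_a", "centrifugal_pump"),
    ("centrifugal_pump", "critical_for", "cooling_loop"),
}

def entails(subject, predicate, obj):
    """Symbolic side: check a fact against the knowledge graph."""
    return (subject, predicate, obj) in triples

def ml_failure_score(vibration_mm_s):
    """Stand-in for a learned model: maps vibration to a risk score in [0, 1]."""
    return min(1.0, vibration_mm_s / 10.0)

def explainable_alert(asset, vibration_mm_s):
    """Fuse both: alert only when the score is high AND the graph
    confirms the asset type, so the alert carries symbolic context."""
    score = ml_failure_score(vibration_mm_s)
    if score > 0.5 and entails(asset, "is_a", "centrifugal_pump"):
        return (f"{asset}: risk {score:.1f}; symbolic context: "
                "centrifugal pump critical for cooling loop")
    return None

print(explainable_alert("pump-7", 8.0))
```

The graph is what makes the output explainable: the alert can cite the facts it used, which a purely statistical score cannot.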
Accelirate Inc. has created a guide for getting started with artificial intelligence. From enterprise use cases, to the technology involved, and even how to build a world class RPA and AI team, everyone can benefit from this all-inclusive guide to AI.
In today's tech-driven world, the integration of artificial intelligence (AI) into applications has become increasingly prevalent. From personalized recommendations to intelligent chatbots, AI enhances user experiences and optimizes processes. However, building an AI app can seem daunting to those unfamiliar with the process. Fear not! This guide aims to demystify the journey, offering step-by-step insights into how to build an AI app from scratch.
Artificial Intelligence is trendy. Every event, every strategy meeting and every consulting firm talks about it. This whitepaper aims to separate actual facts and important background information from the overarching marketing buzz.
You will get a short but information-rich wrap up about: What causes the current hype? Where are we today? What are the innovation leaders doing with AI? And what are immediate action points to focus on by applying artificial intelligence to your business?
AI & Cognitive Computing are some of the most popular business and technical words out there. It is critical to get a basic understanding of Cognitive Computing, which helps us appreciate the technical possibilities and business benefits of the technology.
This second machine age has seen the rise of artificial intelligence (AI), or "intelligence" that is not the result of human cogitation. It is now ubiquitous in many commercial products, from search engines to virtual assistants. AI is the result of exponential growth in computing power, memory capacity, cloud computing, distributed and parallel processing, open-source solutions, and global connectivity of both people and machines. The massive amounts of structured and unstructured data (e.g., text, audio, video, sensor) being generated, and the speed at which they are generated, have made it a necessity to process this data quickly and draw meaningful, actionable insights from it.
The State of Artificial Intelligence in 2018: A Good Old Fashioned Report - Nathan Benaich
Artificial intelligence (AI) is a multidisciplinary field of science whose goal is to create intelligent machines.
We believe that AI will be a force multiplier on technological progress in our increasingly digital, data-driven world.
This is because everything around us today, ranging from culture to consumer products, is a product of intelligence.
In this report, we set out to capture a snapshot of the exponential progress in AI with a focus on developments in the past 12 months. Consider this report as a compilation of the most interesting things we’ve seen that seeks to trigger informed conversation about the state of AI and its implication for the future.
We consider the following key dimensions in our report:
Research: Technology breakthroughs and their capabilities.
Talent: Supply, demand and concentration of talent working in the field.
Industry: Large platforms, financings and areas of application for AI-driven innovation today and tomorrow.
Politics: Public opinion of AI, economic implications and the emerging geopolitics of AI.
Collaboratively produced in East London, UK by:
- Nathan Benaich, Founder of Air Street Capital (www.airstreet.com) and RAAIS (www.raais.co).
- Ian Hogarth, Visiting Professor at UCL's IIPP (https://www.twitter.com/IIPP_UCL) and angel investor.
This emerging tech research from CompTIA describes the growing role of artificial intelligence in the technology strategies that businesses are building.
In the Dark? Understanding Big Data & AI: Talent Acquisition Strategies for 2018 - Yoh Staffing Solutions
Big Data and AI have changed the way companies acquire people. Is your organization one of them? Shed some light on this innovation with these valuable tips and gain a better understanding of the implications Big Data and AI can have on your talent acquisition strategy.
This presentation introduces an Earth Model, CCEM (Coupling Coarse Earth Models), which is a system dynamic simulation model representing the earth as a complex system and focusing on feedback loops associated with global warming. CCEM combines five simpler models, addressing energy availability, economic adjustment to energy scarcity, energy transition, global economy and CO2 emissions, and the impact of CO2 emissions on warming and society. The model aims to make implicit beliefs explicit and demonstrate that the same mental model can support various viewpoints by changing beliefs associated with "known unknowns." Five "known unknowns" discussed in the text include the future availability and cost of energy, energy needs and affordability for the economy, the speed of energy substitution, expected GDP growth, and the economic and societal consequences of global warming.
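The feedback structure described above can be sketched as a toy system-dynamics loop. All coefficients below are made-up placeholders, not CCEM's calibrated values: GDP growth drives emissions, cumulative emissions drive warming, and warming feeds back as damage on GDP growth.

```python
def simulate(years=5, gdp=100.0, co2=0.0, growth=0.03,
             emission_per_gdp=0.01, warming_per_co2=0.1,
             damage_per_degree=0.005):
    """One feedback loop per simulated year; returns (gdp, warming) pairs."""
    trajectory = []
    for _ in range(years):
        emissions = emission_per_gdp * gdp
        co2 += emissions                      # emissions accumulate
        warming = warming_per_co2 * co2       # warming follows cumulative CO2
        # Feedback: warming reduces next year's effective growth rate.
        gdp *= 1 + growth - damage_per_degree * warming
        trajectory.append((round(gdp, 2), round(warming, 3)))
    return trajectory

for gdp, warming in simulate():
    print(gdp, warming)
```

Changing a coefficient here is the analogue of changing a belief about a "known unknown": the loop structure stays fixed while the trajectory shifts, which is exactly the point the model description makes.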
This talk is about data-driven transformation and its contribution to digital transformation. The first part shows the necessity of adopting the "software revolution" to adapt constantly to the customer’s environment. I then speak about "Exponential Information Systems", which are the foundation for these data-driven ambitions: enterprise-wide flows, customer-time data freshness, future-proof unified semantics, etc.
The last part talks about Exponential Technologies, such as Artificial intelligence and machine learning, to drive more value from data
This is an old simulation project in the field of "Global Warming Serious Games". This preliminary model is shared because the author plans to resume his work on this topic using Evolutionary Game Theory
Presentation given on 23 January at the MEDEF / AFIA day on Artificial Intelligence - recommendations for companies, based on the working group of the Académie des Technologies.
Talk given during the "Management and Social Networks" conference in Geneva (2012). Towards a "theory of meeting", with a focus on meeting systems, efficiency, affiliation network, information propagation.
Theory of Meeting, Affiliation Networks, Social Networks, contact frequency. A 2008 presentation about computer models to better understand the efficiency of meetings.
Managing Business Processes Communication and Performance - Yves Caseau
Presentation at ICORES 2012 on Enterprise models.
This talk presents a computational model of a generic enterprise (BPEM, which stands for Business Process Enterprise Model), based upon the core concept of business process. BPEM may be seen as a bridge between two worlds of “Enterprise Models”, the world of mathematical models, formal and fully operational for optimization purposes and the world of conceptual models (boxes & arrows type) for management science, for reasoning and communicating about what a company is.
Enterprise 3.0 Principles : common traits of new forms of enterprise organization, as seen in multiple new books published in the past ten years, such as Reinventing Organizations or Freedom, Inc.
Serious Games as a Tool to Understand Complexity in Market Competition: An Evolutionary Game Theory Simulation Platform
Presentation to Labex MS2T, UTC Compiegne
Strategies for Successful Data Migration Tools - varshanayak241
Data migration is a complex but essential task for organizations aiming to modernize their IT infrastructure and leverage new technologies. By understanding common challenges and implementing these strategies, businesses can achieve a successful migration with minimal disruption. Data migration tools like Ask On Data play a pivotal role in this journey, offering features that streamline the process, ensure data integrity, and maintain security. With the right approach and tools, organizations can turn the challenge of data migration into an opportunity for growth and innovation.
Your Digital Assistant.
Making a complex approach simple. A straightforward process saves time. No more waiting to connect with the people that matter to you. Safety first is not a cliché: information is securely protected in cloud storage to prevent any third party from accessing your data.
Would you rather make your visitors feel burdened by making them wait, or choose VizMan for a stress-free experience? VizMan is an automated visitor management system that works for any industry, including factories, societies, government institutes, and warehouses. It is a new-age, contactless way of logging information about visitors, employees, packages, and vehicles. As a digital logbook, VizMan eliminates the unnecessary use of paper and space: no more bundles of registers left to collect dust in a corner of a room. It records visitors' essential details, helps in scheduling meetings for visitors and employees, and assists in supervising employee attendance. With VizMan, visitors don’t need to wait for hours in long queues. VizMan handles visitors with the value they deserve, because we know time is important to you.
Feasible Features
One Subscription, Four Modules – Admin, Employee, Receptionist, and Gatekeeper – ensures confidentiality and prevents data from being manipulated
User Friendly – can be easily used on Android, iOS, and Web Interface
Multiple Accessibility – Log in through any device from any place at any time
One app for all industries – a Visitor Management System that works for any organisation.
Stress-free Sign-up
Visitor is registered and checked-in by the Receptionist
Host gets a notification, where they opt to Approve the meeting
Host notifies the Receptionist of the end of the meeting
Visitor is checked-out by the Receptionist
Host enters notes and remarks of the meeting
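The check-in flow above amounts to a small state machine. This sketch is purely illustrative (VizMan's actual implementation is not public); it just encodes the sequence of steps listed:

```python
# Ordered states of one visit, matching the sign-up flow above.
FLOW = ["registered", "checked_in", "approved", "meeting_ended", "checked_out"]

class Visit:
    def __init__(self, visitor):
        self.visitor = visitor
        self.state = "registered"

    def advance(self):
        """Move to the next step; 'checked_out' is terminal."""
        i = FLOW.index(self.state)
        if i + 1 < len(FLOW):
            self.state = FLOW[i + 1]
        return self.state

visit = Visit("Ada")
while visit.state != "checked_out":
    print(visit.advance())
# prints: checked_in, approved, meeting_ended, checked_out
```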
Customizable Components
Scheduling Meetings – Host can invite visitors for meetings and also approve, reject and reschedule meetings
Single/Bulk invites – Invitations can be sent individually to a visitor or collectively to many visitors
VIP Visitors – Additional security of data for VIP visitors to avoid misuse of information
Courier Management – Keeps a check on deliveries like commodities being delivered in and out of establishments
Alerts & Notifications – Get notified on SMS, email, and application
Parking Management – Manage availability of parking space
Individual log-in – Every user has their own log-in id
Visitor/Meeting Analytics – Evaluate notes and remarks of the meeting stored in the system
Visitor Management System is a secure and user-friendly database manager that records, filters, and tracks the visitors to your organization.
"Secure Your Premises with VizMan (VMS) – Get It Now"
Paketo Buildpacks: the best way to build OCI images? DevopsDa... - Anthony Dahanne
Buildpacks have existed for more than 10 years! At first, they were used to detect and build an application before deploying it to certain PaaS platforms. Then, with their latest generation, Cloud Native Buildpacks (a CNCF incubating project), we became able to create Docker (OCI) images. Are they a good alternative to the Dockerfile? What are the Paketo buildpacks? Which communities support them, and how?
Come find out in this ignite session.
Providing Globus Services to Users of JASMIN for Environmental Data AnalysisGlobus
JASMIN is the UK’s high-performance data analysis platform for environmental science, operated by STFC on behalf of the UK Natural Environment Research Council (NERC). In addition to its role in hosting the CEDA Archive (NERC’s long-term repository for climate, atmospheric science & Earth observation data in the UK), JASMIN provides a collaborative platform to a community of around 2,000 scientists in the UK and beyond, providing nearly 400 environmental science projects with working space, compute resources and tools to facilitate their work. High-performance data transfer into and out of JASMIN has always been a key feature, with many scientists bringing model outputs from supercomputers elsewhere in the UK, to analyse against observational or other model data in the CEDA Archive. A growing number of JASMIN users are now realising the benefits of using the Globus service to provide reliable and efficient data movement and other tasks in this and other contexts. Further use cases involve long-distance (intercontinental) transfers to and from JASMIN, and collecting results from a mobile atmospheric radar system, pushing data to JASMIN via a lightweight Globus deployment. We provide details of how Globus fits into our current infrastructure, our experience of the recent migration to GCSv5.4, and of our interest in developing use of the wider ecosystem of Globus services for the benefit of our user community.
Developing Distributed High-performance Computing Capabilities of an Open Sci...Globus
COVID-19 had an unprecedented impact on scientific collaboration. The pandemic and its broad response from the scientific community has forged new relationships among public health practitioners, mathematical modelers, and scientific computing specialists, while revealing critical gaps in exploiting advanced computing systems to support urgent decision making. Informed by our team’s work in applying high-performance computing in support of public health decision makers during the COVID-19 pandemic, we present how Globus technologies are enabling the development of an open science platform for robust epidemic analysis, with the goal of collaborative, secure, distributed, on-demand, and fast time-to-solution analyses to support public health.
We describe the deployment and use of Globus Compute for remote computation. This content is aimed at researchers who wish to compute on remote resources using a unified programming interface, as well as system administrators who will deploy and operate Globus Compute services on their research computing infrastructure.
Software Engineering, Software Consulting, Tech Lead.
Spring Boot, Spring Cloud, Spring Core, Spring JDBC, Spring Security,
Spring Transaction, Spring MVC,
Log4j, REST/SOAP WEB-SERVICES.
Large Language Models and the End of ProgrammingMatt Welsh
Talk by Matt Welsh at Craft Conference 2024 on the impact that Large Language Models will have on the future of software development. In this talk, I discuss the ways in which LLMs will impact the software industry, from replacing human software developers with AI, to replacing conventional software with models that perform reasoning, computation, and problem-solving.
Advanced Flow Concepts Every Developer Should KnowPeter Caitens
Tim Combridge from Sensible Giraffe and Salesforce Ben presents some important tips that all developers should know when dealing with Flows in Salesforce.
Exploring Innovations in Data Repository Solutions - Insights from the U.S. G...Globus
The U.S. Geological Survey (USGS) has made substantial investments in meeting evolving scientific, technical, and policy driven demands on storing, managing, and delivering data. As these demands continue to grow in complexity and scale, the USGS must continue to explore innovative solutions to improve its management, curation, sharing, delivering, and preservation approaches for large-scale research data. Supporting these needs, the USGS has partnered with the University of Chicago-Globus to research and develop advanced repository components and workflows leveraging its current investment in Globus. The primary outcome of this partnership includes the development of a prototype enterprise repository, driven by USGS Data Release requirements, through exploration and implementation of the entire suite of the Globus platform offerings, including Globus Flow, Globus Auth, Globus Transfer, and Globus Search. This presentation will provide insights into this research partnership, introduce the unique requirements and challenges being addressed and provide relevant project progress.
OpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoamtakuyayamamoto1800
In this slide, we show the simulation example and the way to compile this solver.
In this solver, the Helmholtz equation can be solved by helmholtzFoam. Also, the Helmholtz equation with uniformly dispersed bubbles can be simulated by helmholtzBubbleFoam.
How Does XfilesPro Ensure Security While Sharing Documents in Salesforce?XfilesPro
Worried about document security while sharing them in Salesforce? Fret no more! Here are the top-notch security standards XfilesPro upholds to ensure strong security for your Salesforce documents while sharing with internal or external people.
To learn more, read the blog: https://www.xfilespro.com/how-does-xfilespro-make-document-sharing-secure-and-seamless-in-salesforce/
Listen to the keynote address and hear about the latest developments from Rachana Ananthakrishnan and Ian Foster who review the updates to the Globus Platform and Service, and the relevance of Globus to the scientific community as an automation platform to accelerate scientific discovery.
Why React Native as a Strategic Advantage for Startup Innovation.pdfayushiqss
Do you know that React Native is being increasingly adopted by startups as well as big companies in the mobile app development industry? Big names like Facebook, Instagram, and Pinterest have already integrated this robust open-source framework.
In fact, according to a report by Statista, the number of React Native developers has been steadily increasing over the years, reaching an estimated 1.9 million by the end of 2024. This means that the demand for this framework in the job market has been growing making it a valuable skill.
But what makes React Native so popular for mobile application development? It offers excellent cross-platform capabilities among other benefits. This way, with React Native, developers can write code once and run it on both iOS and Android devices thus saving time and resources leading to shorter development cycles hence faster time-to-market for your app.
Let’s take the example of a startup, which wanted to release their app on both iOS and Android at once. Through the use of React Native they managed to create an app and bring it into the market within a very short period. This helped them gain an advantage over their competitors because they had access to a large user base who were able to generate revenue quickly for them.
Accelerate Enterprise Software Engineering with PlatformlessWSO2
Key takeaways:
Challenges of building platforms and the benefits of platformless.
Key principles of platformless, including API-first, cloud-native middleware, platform engineering, and developer experience.
How Choreo enables the platformless experience.
How key concepts like application architecture, domain-driven design, zero trust, and cell-based architecture are inherently a part of Choreo.
Demo of an end-to-end app built and deployed on Choreo.
Accelerate Enterprise Software Engineering with Platformless
Taking Advantage of AI – July 2018
1. Artificial Intelligence and Machine Learning – July 2018 1/14
Yves Caseau
National Academy of Technologies
Michelin CIO
Taking Advantage of AI
July 11th, 2018
V0.4
Slide 2/14
Outline
Artificial Intelligence and Machine Learning "revolution"
A glance at the "toolbox": methods, protocols and assembly
A "how to" guide for corporations
How to grow "emergence"?

Sections: 1: AI "renewal" / 2: The "Toolbox" / 3: Call to action
Slide 3/14
AI "renewal" / Technology academy workgroup

Spectacular investment acceleration
Major players and venture capital
Belief that major benefits are yet to come

Spectacular performance acceleration
Image and speech recognition, translation, ...
AlphaGo, etc.
Moore's law does not explain everything

Workgroup questions
Revolution or evolution?
AI algorithms = commodity?
"Exponential Organization"?
Slide 4/14
Taking advantage of AI availability

Vaccine manufacturing at Merck
5 terabytes in a "data lake"
Batch yield optimization

IHG InterContinental (hotels)
Ultra-fine customer segmentation
Similar approach at Amadeus

Ejection fraction analysis (cardiology)
Contest prepared by doctors and cardiologists
DNN to compute a volume through image analysis

FAA
Long-term delay forecast through a Bayesian network
Taking the "avalanche effect" into account
5 years of data, 52 million flights – noisy data
Slide 5/14
Most AI applications are built on top of a feedback loop

Iterative development of an AI practice: smart algorithms → smart engineering → smart services → service usage → growing large datasets, which feed the algorithms in turn. The speed of learning depends on computing power.

Surrounding enablers: distributed software engineering practices; management vision & grit; ease of data collection; trust & acceptability.

[Diagram grades for each factor:]
Scientists, open source, startups: A
Medium-sized software players (lacking): B-
Risk-averse, lack of software culture: C+
Market size / language: B+
GDPR, CNIL: B-
Competitive access to GPU/TPU: B+
Slide 6/14
Today's Artificial Intelligence (& ML) makes an extended toolbox

[Diagram: a 2x2 map of the toolbox along two axes, precise vs. open question, and few data vs. lots of data:]

Rules, OR / NLP (precise question, few data): rule-based and constraint-based approaches (e.g., configuration); fuzzy boundary with operations research
Classical "data science" methods (precise question, lots of data): most companies' "AI use cases"; Einstein / TellMePlus / Da Vinci Labs
Agents, evolutionary game theory (open question, few data): Moore's Law & Big Data; key role of simulation; well suited to complex systems
Semantics (e.g., Watson): continuous but slow progress; news articles written by robots
Deep Learning (CNN) (open question, lots of data): pattern/situation recognition; this decade's inflexion point
Slide 7/14
Some key pieces in the toolbox

Linear / logistic regression
Bayesian networks
Regularization
(K-means) clustering
Random forest
Gradient boosting
Support-vector machines
Neural networks
Ontologies
Lexicographic tools
Rule-based scripting
ARMA, ARIMA, etc.
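To make the list concrete, here is a minimal scikit-learn sketch comparing three of these pieces on the same task; the synthetic dataset and every parameter choice are illustrative assumptions, not from the talk:

```python
# Compare three classic toolbox pieces on one synthetic classification task.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a business dataset (all parameters are invented).
X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "gradient boosting": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy {model.score(X_te, y_te):.3f}")
```

The point is less which model wins than how little code separates trying one toolbox piece from trying three.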
Slide 8/14
Meta-heuristics to mix these components

Reinforcement learning
Transfer learning
Natural language processing toolbox
Large-scale intelligent agent communities
Game theory to reason about competition and cooperation

Hybrid AI: combine the different tools and meta-heuristics
Generative approaches
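Reinforcement learning, the first item above, can be sketched with tabular Q-learning on a toy corridor; the environment, reward and hyperparameters are invented for illustration:

```python
import random

random.seed(0)

# Toy corridor: states 0..4, start at state 0, reward 1.0 for reaching state 4.
N_STATES, GOAL = 5, 4
ACTIONS = (-1, +1)                 # step left / step right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.5  # learning rate, discount, exploration

for _ in range(500):               # training episodes
    s = 0
    while s != GOAL:
        if random.random() < eps:                        # explore
            a = random.choice(ACTIONS)
        else:                                            # exploit
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)            # walls at both ends
        r = 1.0 if s2 == GOAL else 0.0
        best_next = max(Q[(s2, act)] for act in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# Greedy policy after training: it should step right in every non-goal state.
policy = [max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(N_STATES - 1)]
print(policy)
```

The same update rule scales, with function approximation instead of a table, to systems like AlphaGo mentioned earlier.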
Slide 9/14
Cognitive Systems: mixing various AI and Machine Learning techniques

Smart system components:
Perception / environment
Self-consciousness of goals
Forecast and adjust
Growth through usage

Biomimicry:
Develop through reinforcement
Add layered capabilities for resilience

Cognitive computing:
"reason from a purpose" (IBM)
"systems grow by machine learning, not by programmatic design"

[Diagram: an event-driven architecture (EDA) for a smart system. Objects & sensors, the user, services and other systems ("systems of systems") feed a command center that keeps state and history. Capability blocks: CEP (reflexes, react), ACT, THINK (decisions, AI), PLAN (execution logic, goals), FORECAST (anticipation), ANALYZE (machine learning), REFLECT (evolutionary ML), ADAPT (reactive), LEARN (representation), VALUATION (emotions); commands, decisions and insights circulate between them.]
Slide 10/14
AI strategy starts with data collection

Data collection process:
Do not forget the meta-data!
Build qualified training sets
"System thinking" (loop): collect tomorrow's data as well as past data
Prepare for the "machine vision" (& perception) revolution by collecting images and video ... as well as customer digital traces

Ingredients:
Data: lots of them, tagged
Algorithms: most often, open-source
Integration & meta-heuristics
Training protocols
Time & resources
Skills / experience
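As a small sketch of the ingredients above, each collected sample can carry its tag (label) and its meta-data, so that qualified training sets stay filterable and auditable later; all field names and values here are illustrative assumptions:

```python
from datetime import datetime, timezone

# Each record keeps the raw sample, its tag (label), and its meta-data:
# where it came from, when it was collected, and who annotated it.
# All field names and values are invented for illustration.
def make_record(sample, label, source, annotator):
    return {
        "sample": sample,
        "label": label,
        "meta": {
            "source": source,
            "annotator": annotator,
            "collected_at": datetime.now(timezone.utc).isoformat(),
        },
    }

records = [
    make_record("img_001.png", "defect", "line_3_camera", "qa_team"),
    make_record("img_002.png", "ok", "line_3_camera", "qa_team"),
    make_record("img_003.png", "ok", "crowd", "external"),
]

# A "qualified" training set: keep only the samples tagged by the QA team.
qualified = [r for r in records if r["meta"]["annotator"] == "qa_team"]
print(len(qualified))  # 2
```

Keeping the meta-data next to the label is what makes the loop in "system thinking" possible: tomorrow's data can be re-qualified as collection processes change.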
Slide 11/14
Grow the success conditions for your teams

Leverage the "technology wave"
Beware of "false positives":
Overfitting, spurious correlations, ...
... and of biases in the training data

Mindset: distributed and emergent innovation
Data collection / training sets
AI-friendly software environments
Lab culture (data science)
Perseverance

A constant flow of software
It takes time to build skills
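The "false positives" warning can be demonstrated in a few lines: an unconstrained model scores perfectly on its own training data while doing much worse on held-out data. The dataset and model below are assumptions chosen for illustration:

```python
# Overfitting demo: an unconstrained tree memorizes noisy training data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# flip_y=0.2 randomly corrupts 20% of the labels: pure noise to memorize.
X, y = make_classification(n_samples=400, n_features=20, n_informative=3,
                           flip_y=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)  # no depth limit
print(f"train accuracy: {tree.score(X_tr, y_tr):.2f}")  # perfect on seen data
print(f"test accuracy:  {tree.score(X_te, y_te):.2f}")  # markedly lower
```

The gap between the two scores is the overfitting; measuring it on held-out data is the simplest defense against false positives.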
Slide 12/14
Time to act is now

Start right now with tools that are easily available
Simple methods work
Take advantage of "integrated/automated" toolboxes: Einstein, Holmes, TellMePlus, etc.
Secure access to large-scale computing power (GPU & TPU) to increase the speed of learning

[Diagram: AI use cases mapped across the value chain (Research & Development, Digital Manufacturing, Deliver Product / Supply Chain, Assist Customer):]
Pattern detection
Customer interaction (e.g., chatbots / smart assistants)
Operations support / information systems, digital traces (IoT)
Fraud
Predictive maintenance
Quality assurance
Automation, forecast / optimization
Robotic process automation
Knowledge engineering
Search
Slide 13/14
To develop one's situation potential (emergence)

Artificial intelligence is not a service that you buy; it is a practical skill that one must grow. It takes time ...

Learning curve:
To develop the kind of AI that is suited to one's business
To work within a small team with outside experts (e.g., from academia)
To organize contests with business training sets
To build a continuous improvement process

Think platform:
Large-scope vision (upstream & downstream value chain)
"Win/win": learn to share data
Example: today's "stupid" chatbots collect data that will be used to train tomorrow's smart assistants
Slide 14/14
Main take-away

These are the five domains that anyone should start investigating without delay:
1. Smart automation: RPA scripting tools, rule engines
2. Natural language processing: bots & ontologies, sentiment analysis APIs
3. Pattern recognition: random forests, neural nets
4. Forecasting: machine learning toolboxes / prediction APIs / Bayesian networks
5. Machine vision: play with CNNs (TensorFlow)
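As a starting point for take-away 4, a naive baseline is worth wiring up before reaching for machine-learning toolboxes ("simple methods work"); this pure-Python sketch, with invented data, forecasts the next value as the mean of a sliding window:

```python
# Naive moving-average forecaster: predict the next value as the mean of
# the last `window` observations. A baseline to beat before heavier tools.
def moving_average_forecast(series, window=3):
    if len(series) < window:
        raise ValueError("need at least `window` observations")
    return sum(series[-window:]) / window

demand = [100, 102, 98, 105, 107, 103]  # invented monthly demand figures
print(moving_average_forecast(demand))            # mean of the last 3 points
print(moving_average_forecast(demand, window=6))  # mean of all 6 points
```

Any ARIMA model or prediction API then has a concrete number to beat.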
Conclusion
Editor's Notes
Hello everyone,
I am going to present the main messages of the report.
Organized in three parts:
Context relative to the President's speech
Recommendations to the stakeholders / complement to the Villani report
Advice to companies (the most original part of the report)
(1) Give other examples with curves. The key point is that we have surpassed human performance.
ImageNet challenge: better than the human 5% error rate with ML, for Google and MS
http://www.eetimes.com/document.asp?doc_id=1325712
(2) Cite key figures
Equity deals to startups in artificial intelligence — including companies applying AI solutions to verticals like healthcare, advertising, and finance as well as those developing general-purpose AI tech — increased nearly 6x, from roughly 70 in 2011 to nearly 400 in 2015.
https://www.cbinsights.com/blog/artificial-intelligence-startup-funding-trends/
GAFIM: Google, Amazon, Facebook, IBM, Microsoft: billions of R&D dollars over a few years
(3) Answer
Yes, it is a revolution because it is a rupture => immediate effects and effects still to come. Beware: the field of AI is vast, and maturity is uneven.
Algorithms are commodities, data less so, and training protocols not at all!
Wait for our report ... I will come back to it in the conclusion.
The last example, and the one that struck me most, is Merck's, since it applies to the optimization of industrial production processes, which interests me professionally. "The manufacturing team used data science to conduct a large-scale analysis to integrate and analyze 5 terabytes of data using 15 billion calculations and more than 5.5 million batch-to-batch comparisons. They then created a 'heat map' showing data clusters associated with high and low yields. Experts could look at the heat map, recommend changes, rework predictive models, and then run more analyses. ... Merck uses a data lake for the petabytes of data its manufacturing plants generate. The data come in all formats, combining both in-house and outside data sets that extend backward up the production chain all the way to suppliers of raw materials. ... In December 2016, it christened its first plant-wide analytics system in Singapore. A single dashboard will display real-time data flowing in from every part of the plant—manufacturing, tablet production, packaging, quality, warehousing, shipping, and so on." From the CIO's point of view, the main difference with the previous tools is having moved from a reactive to a proactive approach: "We want to look at the data now and not wait until we have a problem".
A second interesting example is the hotel chain IHG (InterContinental), which used very large volumes of data to completely revisit its customer segmentation, producing tens of thousands of profiles: "We concluded that advanced computation could identify more hidden relationships between customer attributes and likelihood-to-respond than is possible with... traditional modeling methods".
Among these examples, I first noted a use of machine learning to evaluate a flow (ejection fraction) from video images: "Within three months, many of the teams had devised algorithms that enabled computers to read MRI cross sections as quickly as they are taken. The machines learned to find the specific image that shows the heart in its totally relaxed state (full of blood) and another in its totally contracted state (during pumping). They then compared the two and calculated the ejection fraction." What is interesting here is that, while the data were prepared and annotated by experts in the field, the team that produced the best algorithm knew nothing about cardiology: "The remarkable aspect about the winning team was that neither teammate knew anything about cardiology before the competition. Never before have organizations had at their disposal the global pool of talent to tackle the most complex problems of our time—including problems in fields of knowledge that data scientists know nothing about."
The example of the FAA, which chose to analyze a massive volume of flight data globally, is illustrative: "At the FAA, the team applied its computing horsepower to a data sample of 52 million flights over five years. The sample included 5.25 million rows of data. The computations were even more complicated than anticipated because the data were not clean; the Bayesian belief network was needed because it can estimate missing values amid all that complexity". I cite this example because I too often hear that machine learning or artificial intelligence can only be used on cleaned and exact data. That is not the case: we have long known how to apply learning methods to noisy data, but one must be aware of it! What causes problems is wrong data that we believe to be right.
Intensional definition => complex subject; the extensional definition leads to the toolbox
To sort things out, two axes: (a) precise / open question, (b) few data / lots of data
Panorama
1: broad boundaries => debatable from a conceptual standpoint but practical from an operational one
Cf. the La Tribune article by Bruno Maisonnier
https://www.latribune.fr/technos-medias/bruno-maisonnier-qualifier-d-intelligence-le-couple-deep-learning-et-reseaux-de-neurones-est-une-usurpation-773596.html
In the long run, hybridization is needed – cf. the TODAI Robot
We could go lower and find common elements (local optimization, gradient descent, etc.)
Put a photo of Sundar Pichai and Google Agent (ML and ontologies)
Reinforcement learning => in AlphaGo but also in Libratus, from Carnegie Mellon (poker)
Transfer learning: a modular view of deep networks; example at Amadeus: learn with a complete context, then use with a more limited context
Constant progress: semantic network axis + perception axis => coupled systems
Moore's law too: see Yves Demazeau / 1 million agents; see also Cosmotech: modeling of complex systems
AXELROD
Todai Robot / also Andreessen Horowitz in 2017
This picture is taken from my blog post about biomimicry
Template for an event-driven architecture for smart systems, such as the smart home
Need to find three stories about this picture
This is a fractal design: what you see here is a component!
Same capabilities at all scales. Even simple objects (smart sensors) have perception / goals / computing capabilities. Small systems are smart, to generate fewer, higher-value events.
(2) Smart systems are grown by machine learning! Reinforce what gets used! Similarity with muscle and bio-engineering
Layered architecture: dumb functions for dumb tasks, reinforce resilience
Avoid the NEST catastrophe ... or the Bill Gates house syndrome => PC down, no lights
(3) Quote IBM from John Kelly .... but also Kevin Kelly
Favorite quote by Kevin Kelly (20 years ago):
"Investing machines with the ability to adapt on their own, to evolve in their own directions, and grow without human oversight is the next great advance in technology. Giving machines freedom is the only way we can have intelligent control."
Importance of teleonomy in system design – cf. IBM
The universal message from the experts: start with the data
Cf. Google's valuation of its acquisitions
Pierre Haren: IP value = test sets (annotated) and training protocols
(2) McKinsey: tangible and significant results, but only 20% practice commercially, 40% experiment
(3) Example of the lambda architecture, a fundamental since 2010 / not really practiced
Benefit from the outside => build the conditions inside
Many successes with simple methods, lots of data and a lot of know-how
Not AI-as-a-Service! On the contrary, strongly domain-dependent
4 examples:
Ford: massive collection of events on instrumented cars to capture insights
Measurement of the "ejection fraction" on echocardiography video images / obtained through a contest by non-specialists (on data produced by specialists/doctors)
Manufacturing process at Merck: massive collection of all the parameters, enriched with environment data plus data from the equipment suppliers (similar to GE)
"Segmentation of one" (or almost) to create a loyalty program at IHG: InterContinental Hotel Group / has used big data for a long time / the new approach (algorithms & machines & massive data) gives much better results
A practical breakdown of the preceding concepts for insurance:
- a matrix of customer processes / a time axis, from the back office to customer contact
- topics everywhere! Varying levels of maturity
- from dark blue to light blue
https://en.wikipedia.org/wiki/Robotic_process_automation
Already quite a few applications
Large insurance companies
Fukoku Mutual Life Insurance http://mainichi.jp/english/articles/20161230/p2a/00m/0na/005000c
Example of Alliance: claim management
(2) A very rough diagram because of the variety and constant evolution
So one must get into a position of permanent evolution
Know how to integrate / use open source