Ph.D. Thesis: A Methodology for the Development of Autonomic and Cognitive In..., Università della Calabria
Doctoral Defence in ICT (Università della Calabria, Italy). Ph.D. candidate Claudio Savaglio. Thesis title: A Methodology for the Development of Autonomic and Cognitive Internet of Things Ecosystems.
e-SIDES workshop at ICT 2018, Vienna, 5/12/2018, e-SIDES.eu
This document summarizes a session discussing how to build the next privacy and security research agenda for big data. The session included an introduction, a discussion of the e-SIDES community position paper and the process for providing input, a Mentimeter voting activity, and a panel on ensuring responsible research and innovation responds to real needs. The panel featured representatives from universities and research organizations discussing issues like integrating privacy from the start, understanding cultural and regional differences, and ensuring research aligns with societal values and needs. The position paper and future research agenda aim to provide recommendations for an ethically sound approach to big data.
Artificial intelligence has become a buzzword impacting every industry in the world. With the rise of such advanced technology, there will always be questions regarding its impact on our social life, environment and economy, and thus on all efforts towards sustainable development. In the information era, enormous amounts of data have become available to decision makers. Big data refers to datasets that are not only large, but also high in variety and velocity, which makes them difficult to handle using traditional tools and techniques. Due to the rapid growth of such data, solutions need to be studied and provided in order to handle these datasets and extract value and knowledge from them for different industries and business operations. Numerous use cases have shown that AI can ensure an effective supply of information to citizens, users and customers in times of crisis. This paper aims to analyse some of the different methods and scenarios in which AI and big data can be applied, as well as the opportunities their application provides in various business operations and crisis management domains.
On Computer Science Trends and Priorities in Palestine, Mustafa Jarrar
On Computer Science Trends and Priorities in Palestine,
by Mustafa Jarrar
Computer Science
Birzeit University, Palestine
Personal Page: http://www.jarrar.info
At Workshop on IT Research Trends and Priorities
Islamic University of Gaza, Palestine
28 March, 2015
This document provides an overview of a presentation on deep learning given by Melanie Swan. The key points are:
1) Melanie Swan is a technology theorist who gave a presentation on deep learning and smart networks at a conference in Indianapolis.
2) She discussed the definition and technical details of deep learning, including how it is inspired by concepts from statistical mechanics and physics. Deep learning uses neural networks of processing units to model high-level abstractions in data.
3) Deep learning has many applications including image recognition, speech recognition, and question answering. It is seen as important due to the large worldwide spending on AI and the growth of data science jobs.
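The summary above notes that deep learning stacks layers of processing units to build high-level abstractions from data. A minimal sketch of that idea is a multi-layer perceptron forward pass; the weights below are random placeholders purely for illustration (a real network learns them by backpropagation):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def mlp_forward(x, weights, biases):
    """Forward pass through a stack of fully connected layers.

    Each hidden layer re-represents its input (linear map + nonlinearity),
    which is the sense in which deep networks model 'abstractions'.
    """
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)                  # hidden layers
    return h @ weights[-1] + biases[-1]      # final linear layer (e.g. logits)

rng = np.random.default_rng(0)
weights = [rng.standard_normal((4, 8)), rng.standard_normal((8, 3))]
biases = [np.zeros(8), np.zeros(3)]
out = mlp_forward(rng.standard_normal((2, 4)), weights, biases)
print(out.shape)  # (2, 3): two inputs mapped to three outputs
```

Adding more hidden layers between input and output is what makes the network "deep"; the statistical-mechanics inspiration mentioned in the talk concerns how such layered representations are analysed, not the mechanics of the forward pass itself.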
Finely Chair talk: Every company is an AI company - and why Universities sho..., Amit Sheth
Video: https://youtu.be/ZS8rGSzb_9I
The context of this talk is this statement from the host institution's provost: "We are trying to mobilize our campus activities around AI." I connect academic initiatives in Interdisciplinary AI with industry needs.
--- Original abstract -----
Every company now is an AI company: Now, Near Future, or Distant Future?
Amit Sheth, AI Institute, University of South Carolina
“Every company now is an AI company. The industrial companies are changing, the supply chain…every single sector, it’s not only tech,” said Steven Pagliuca, CEO of Bain Capital, at the 2019 World Economic Forum. With this statement as the context, I will provide an overview of the AI landscape -- what AI capabilities are real, what is being oversold, what is nonexistent, and what is unlikely in our lifetime. I will also provide an anecdote-supported review of a broad variety of current and imminent applications of AI that rely on some of the well-developed and emerging AI capabilities. The objective is to help those considering AI applications start thinking of new business opportunities, new products and services, and new revenue/business models in the context of the rapid penetration of AI technologies everywhere. I will seek to answer: Is AI just hype, or something already happening? If it has not happened in your industry, is it impending? Do the bad impacts of AI outweigh the good?
Video at: https://www.linkedin.com/video/live/urn:li:ugcPost:6705141260845412352/
In this talk, we will review some of the challenges related to Industry 4.0, or the Factory of the Future, and how Artificial Intelligence can help address them.
Examples include the use of semantic interoperability and integration to support the use of sensor-collected data in decision making, the use of computer vision to identify process deviations and manage quality, and the use of predictive algorithms for device maintenance.
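The predictive-maintenance example above can be sketched in miniature: flag a device when a sensor reading deviates sharply from its recent rolling baseline. This is a hypothetical, simplified stand-in for the predictive algorithms the talk refers to, not Sheth's actual method; the window and z-score threshold are illustrative assumptions.

```python
from statistics import mean, stdev

def maintenance_alert(readings, window=5, z_thresh=3.0):
    """Return indices where a reading deviates more than z_thresh
    standard deviations from the mean of the preceding window."""
    alerts = []
    for i in range(window, len(readings)):
        base = readings[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_thresh:
            alerts.append(i)
    return alerts

# Synthetic vibration signal with one anomalous spike
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 0.95, 5.0, 1.0]
print(maintenance_alert(vibration))  # [7]: the spike is flagged
```

Real deployments would replace the rolling z-score with a learned model over many sensors, but the shape of the problem -- baseline, deviation, alert -- is the same.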
This document summarizes two real-world use cases for digital preservation: embedded systems development and financial engineering. For the embedded systems case, the document describes research on developing an autonomous underwater vehicle and identifies relevant data formats, software tools, and hardware. For the financial engineering case, it outlines research calibrating an inflation model using treasury bond data from sources like Bloomberg and analyzing it in MATLAB. Both cases could benefit from emulation to help preserve workflows and access data over the long term.
Modern signal processing is dead without machine learning! 5 July 2020, Dr G R Sinha
This lecture highlights the role of Machine Learning in modern signal processing applications such as driverless cars, robotics, smart environment monitoring, and healthcare.
From Aspiration to Reality: Open Smart Cities
Open smart cities might become a reality for Canada. Globally, there are a number of initiatives, programs, and practices that resemble open smart cities, which means it is possible to have an open, responsive and engaged city that is socio-technologically enabled but also receptive to, and willing to grow, a critically informed type of technological citizenship (Feenberg). For an open smart city to exist, public officials, the private sector, scholars, civil society, residents and citizens require a definition and a guide to start the exercise of imagining what an open smart city might look like. There is much critical scholarship about the smart city, and there are many counter-smart-city narratives, but there are few depictions of what engagement, participatory design and technological leadership might be. The few examples that do exist are project-based, and few are systemic. An open smart city definition and guide was therefore created by a group of stakeholders in such a way that it can be used as the basis for designing an open smart city from the ground up, or to help actors shape or steer the course of emerging or ongoing data and networked urbanist forms (Kitchin) of smart cities, leading them towards being open, engaged and receptive to technological citizenship.
This talk will discuss some of the successes resulting from this Open Smart Cities work, which might also be called a form of engaged scholarship. For example, the language of the call for tender of the Infrastructure Canada Smart City Challenge was modified to require that engagement and openness be part of the submissions from communities. Also, those involved with the guide have been writing policy articles that critique either AI or the smart city while also offering examples of what is possible; these articles are being read by proponents of Sidewalk Labs in Toronto. In addition, the global Open Data Conference held in Argentina in September 2018 hosted a full workshop on Open Smart Cities, and Open North is working toward developing key performance indicators to assess the communities shortlisted by Infrastructure Canada and to help them develop an Open Smart Cities submission. The objective of the talk is to demonstrate that it is actually possible to shift public policy on large infrastructure projects, at least in the short term.
The document discusses Hildebrandt's taxonomy of code-driven law, data-driven law, and text-driven law. It notes that code-driven law involves legal norms articulated in computer code, while data-driven law uses automatic decision-making derived from statistical/inductive methods. Text-driven law refers to legal activity performed by humans using sources like statutes and case law. The document also examines various issues that can arise with computational approaches to law, such as whether training data adequately represents the problem domain and whether all relevant facts and rules are considered. It acknowledges limitations in fully formalizing concepts like justice and cautions that computational ideals assume problems may be solved perfectly.
The webinar explores some of the current opportunities for AI within Life Science and looks ahead to what we can expect to see over the coming years. These are the accompanying slides.
Rao Mikkilineni discusses the emergence of cognitive computing models and a new cognitive infrastructure. He argues that increasing data volumes and the need for real-time insights are driving the need for intelligent, sentient, and resilient systems. The new cognitive infrastructure will include a cognitive and infrastructure agnostic control overlay, composable services, and cognitive deep learning integration. It will enable a post-hypervisor cognitive computing era with intelligent, distributed systems.
Current trends in cognitive science and brain computing research, 18 June 2020, Dr G R Sinha
Medical Image Processing is the study of the acquisition, processing and analysis of various types of medical image modalities. Biomedical Imaging is one such modality; it mainly includes EEG, EMG, fMRI and MEG signals and their analysis for numerous applications such as the diagnosis of mental disorders, sleep analysis, cognitive ability, and the study of memory and attention. Cognitive science research exploits biomedical modalities related to the human brain and makes use of the resulting images and signals to decode and understand brain commands. This is very important in brain-computer interfaces (BCI) and the assessment of cognitive abilities. With the help of EEG signals, the abilities of the human brain can be described, decoded and used to perform desired tasks in numerous applications such as robotics and driverless cars. EEG records brain activity, especially the electrical activity that arises from psychological, physiological and other changes in the human brain. This lecture gives an overview of cognitive science and brain computing research along with its challenges and opportunities.
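A first step in decoding EEG signals of the kind described above is extracting spectral features, e.g. the power in a frequency band such as the alpha band (roughly 8–13 Hz). The sketch below is purely illustrative, using a synthetic sine wave rather than real EEG, and omits the filtering, artefact rejection and windowing a real BCI pipeline would need:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` in the [lo, hi] Hz band --
    an elementary EEG feature a BCI classifier might consume."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

fs = 256                                # sampling rate in Hz
t = np.arange(fs) / fs                  # one second of samples
eeg = np.sin(2 * np.pi * 10 * t)        # synthetic 10 Hz 'alpha' oscillation
alpha = band_power(eeg, fs, 8, 13)
beta = band_power(eeg, fs, 13, 30)
print(alpha > beta)  # True: the energy concentrates in the alpha band
```

Features like these, computed per channel and per time window, are what downstream classifiers use to map brain activity to commands in applications such as robotics.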
This document discusses challenges and opportunities around working with real-world data. It notes that while data is plentiful, real-world data is difficult to obtain due to issues like data silos and privacy concerns. It also discusses problems with data interoperability, quality, reliability, and needing more than just analytics to gain insights. The document advocates for linked open data streams with metadata and scalable analytics tools combined with domain knowledge to create actionable knowledge from real-world data. It concludes by listing challenges and opportunities in providing infrastructure, publishing and analyzing heterogeneous and private data at scale.
Matthew Kitching is a data scientist with over 15 years of experience in artificial intelligence, machine learning, and data science. He holds a Ph.D. in Computer Science from the University of Toronto specializing in artificial intelligence. He has worked as a data scientist at Bell Canada and Apption, developing predictive models and data strategies. He has extensive experience in Python, R, Spark, and Hadoop.
6G networking and connectivity promises significant improvements over 5G through innovative architectures and technologies. 6G aims to enable near-instant, unlimited wireless connectivity to support novel applications like telepresence, autonomous vehicles, and bio-IoT. It envisions integrating space, air, and maritime communications with terrestrial networks. 6G is expected to expand spectrum usage to low THz and visible light bands and employ technologies like nanonetworking, bionetworking, optical networking, and 3D networking. Major research challenges for 6G include developing low-power circuits for new spectrum ranges, seamless integration of multiple technologies, and addressing security and privacy issues in distributed networks.
Towards a Smart (City) Data Science. A case-based retrospective on policies, ..., Enrico Daga
This document summarizes a presentation by Dr. Enrico Daga on smart city data science. It discusses several projects and initiatives in Milton Keynes, UK related to building infrastructure for smart city applications and research. This includes a data hub cataloguing city datasets, tools to support data science education and pilots with local businesses. It also covers work on privacy-aware systems, policy propagation in data flows, and using city data to power simulations and games.
Data ethics and machine learning: discrimination, algorithmic bias, and how t..., Data Driven Innovation
Machine learning and data mining algorithms construct predictive models and decision making systems based on big data. Big data are the digital traces of human activities - opinions, preferences, movements, lifestyles, ... - hence they reflect all human biases and prejudices. Therefore, the models learnt from big data may inherit all such biases, leading to discriminatory decisions. In my talk, I discuss many real examples, from crime prediction to credit scoring to image recognition, and how we can tackle the problem of discovering discrimination using the very same approach: data mining.
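One common quantitative starting point for the kind of discrimination discovery the abstract describes is the disparate impact ratio: compare positive-outcome rates across groups, with ratios below 0.8 flagged under the classic "80% rule". The function and the credit-scoring log below are hypothetical illustrations, not the speaker's actual examples or method:

```python
def disparate_impact(decisions):
    """Ratio of positive-outcome rates between the worst- and
    best-treated groups. `decisions` is a list of (group, outcome)
    pairs with outcome 1 = favourable. Values below 0.8 are the
    classic '80% rule' red flag for possible discrimination."""
    rates = {}
    for group in {g for g, _ in decisions}:
        outcomes = [o for g, o in decisions if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return min(rates.values()) / max(rates.values())

# Hypothetical credit-scoring log: group A approved 3/4, group B 1/4
log = [("A", 1), ("A", 1), ("A", 1), ("A", 0),
       ("B", 1), ("B", 0), ("B", 0), ("B", 0)]
print(round(disparate_impact(log), 2))  # 0.33 -- well below the 0.8 threshold
```

This measures only outcome disparity; the data mining approaches the talk covers go further, searching for the contexts and attribute combinations in which such disparities arise.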
The 4th paradigm of research is manifest in the rising popularity of data science. Data science developments relevant to human genetics are discussed with particular reference to cloud computing and data accessibility.
American Society for Human Genetics, October 16, 2018, San Diego
Europe needs a clear strategy for leveraging the Big Data economy in Europe.
Our objectives are to work at the technical, business and policy levels, shaping the future through the positioning of Big Data in Horizon 2020, and to bring the necessary stakeholders into a sustainable, industry-led initiative that will greatly enhance EU competitiveness by taking full advantage of Big Data technologies.
The document provides an overview of an event on emerging trends in data science given by Dr. Joanne Luciano. It discusses the data science workflow and various processes involved. Some key trends highlighted include increased use of AI and machine learning in data management and reporting, growth of natural language processing, advances in deep learning, emphasis on data privacy and ethics. The document also promotes the new minor in data science offered at University of the Virgin Islands, covering required courses and examples of course sequences for different disciplines.
Machine Learning and AI: An Intuitive Introduction - CFA Institute Masterclass, QuantUniversity
Learn how artificial intelligence (AI) and machine learning are revolutionizing financial services — this course will introduce key concepts and illustrate the role of machine learning, data science techniques, and AI through examples and case studies from the investment industry. The presentation uses simple mathematics and basic statistics to provide an intuitive understanding of machine learning, as used by financial firms, to augment traditional investment decision making.
This overview session offers a tour of machine learning and AI methods, examining case studies to understand the technology companies, data vendors, banks, and fintech startups that are the key players in trading and investment management. Practical examples and case studies will help participants understand key machine learning methodologies, choose an algorithm for a specific goal, and recognize when to use machine learning and AI techniques.
This document provides an overview of big data in Catalonia. It defines big data as large volumes of complex data that require new technologies to extract value. Big data is characterized by its volume, variety, and velocity. The document discusses open data and data ethics. It also reviews the global big data market and how big data is important for different industries. Finally, it examines big data applications, the big data ecosystem in Catalonia, business cases, and the role of big data in addressing COVID-19.
This document provides an overview of big data in Catalonia. It defines big data as large volumes of data that require new technologies to process and extract value due to issues with volume, variety, and velocity. It discusses related concepts like open data, data ethics, artificial intelligence, machine learning, and more. The document also examines the global big data market and applications across different industry sectors and UN Sustainable Development Goals.
Big Data Applications & Analytics Motivation: Big Data and the Cloud; Centerp..., Geoffrey Fox
Motivating Introduction to MOOC on Big Data from an applications point of view https://bigdatacoursespring2014.appspot.com/course
Course says:
Geoffrey motivates the study of X-informatics by describing data science and clouds. He starts with striking examples of the data deluge, drawn from research, business and the consumer. The growing number of jobs in data science is highlighted, and he describes industry trends in both clouds and big data.
He introduces the cloud computing model, developed at amazing speed by industry. The four paradigms of scientific research are described, with growing importance attached to the data-oriented fourth paradigm. He covers three major X-informatics areas -- Physics, e-Commerce and Web Search -- followed by a broad discussion of cloud applications. Parallel computing in general, and particular features of MapReduce, are described. He comments on data science education and the benefits of using MOOCs.
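The MapReduce model mentioned in the course description can be illustrated with its canonical word-count example: a map phase that emits (key, value) pairs per document, and a reduce phase that groups by key and aggregates. This is a single-process sketch of the programming model, not the distributed runtime a framework like Hadoop provides:

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    """Map: emit a (word, 1) pair for every word in one document."""
    return [(word, 1) for word in document.split()]

def reduce_phase(pairs):
    """Shuffle + reduce: group pairs by key and sum the counts."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big clouds", "data science"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
print(reduce_phase(pairs))
# {'big': 2, 'data': 2, 'clouds': 1, 'science': 1}
```

In a real cluster, the map calls run in parallel over partitioned input and the framework shuffles each key's pairs to a reducer, but the programmer writes only the two functions above.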
This document summarizes two real-world use cases for digital preservation: embedded systems development and financial engineering. For the embedded systems case, the document describes research on developing an autonomous underwater vehicle and identifies relevant data formats, software tools, and hardware. For the financial engineering case, it outlines research calibrating an inflation model using treasury bond data from sources like Bloomberg and analyzing it in MATLAB. Both cases could benefit from emulation to help preserve workflows and access data over the long term.
Modern signal processing is dead without machine learning! 5th july 2020Dr G R Sinha
This lecture highlights role of Machine Learning in Modern Signal Processing Applications such as Driver-less Cars, Robotics, Smart Environment Monitoring, Healthcare etc.
From Aspiration to Reality: Open Smart Cities
Open smart cities might become a reality for Canada. Globally there are a number of initiatives, programs, and practices that are open smart city like which means that it is possible to have an open, responsive and engaged city that is both socio-technologically enabled, but also one where there is receptivity to and a willingness to grow a critically informed type of technological citizenship (Feenberg). For an open smart city to exist, public officials, the private sector, scholars, civil society and residents and citizens require a definition and a guide to start the exercise of imagining what an open smart city might look like. There is much critical scholarship about the smart city and there are many counter smart city narratives, but there are few depictions of what engagement, participatory design and technological leadership might be. The few examples that do exist are project based and few are systemic. An open smart city definition and guide was therefore created by a group of stakeholders in such a way that it can be used as the basis for the design of an open smart city from the ground up, or to help actors shape or steer the course of emerging or ongoing data and networked urbanist forms (Kitchin) of smart cities to lead them towards being open, engaged and receptive to technological citizenship.
This talk will discuss some of the successes resulting from this Open Smart Cities work, which might also be called a form or engaged scholarship. For example the language for the call for tender of the Infrastructure Canada Smart City Challenge was modified to include as a requisite that engagement and openness be part of the submissions from communities. Also, those involved with the guide have been writing policy articles that critique either AI or the smart city while also offering examples of what is possible. These articles are being read by proponents of Sidewalk Labs in Toronto. Also, the global Open Data Conference held in Argentina in September of 2018 hosted a full workshop on Open Smart Cities and finally Open North is working toward developing key performance indicators to assess those shortlisted by Infrastructure Canada and to help those communities develop an Open Smart Cities submission. The objective of the talk is to demonstrate that it is actually possible to shift public policy on large infrastructure projects, at least, in the short term.
The document discusses Hildebrandt's taxonomy of code-driven law, data-driven law, and text-driven law. It notes that code-driven law involves legal norms articulated in computer code, while data-driven law uses automatic decision-making derived from statistical/inductive methods. Text-driven law refers to legal activity performed by humans using sources like statutes and case law. The document also examines various issues that can arise with computational approaches to law, such as whether training data adequately represents the problem domain and whether all relevant facts and rules are considered. It acknowledges limitations in fully formalizing concepts like justice and cautions that computational ideals assume problems may be solved perfectly.
The webinar explores some of the current opportunities for AI within Life Science and look ahead to what we can expect to see over the coming years. These are the accompanying slides.
Rao Mikkilineni discusses the emergence of cognitive computing models and a new cognitive infrastructure. He argues that increasing data volumes and the need for real-time insights are driving the need for intelligent, sentient, and resilient systems. The new cognitive infrastructure will include a cognitive and infrastructure agnostic control overlay, composable services, and cognitive deep learning integration. It will enable a post-hypervisor cognitive computing era with intelligent, distributed systems.
Current trends in cognitive science and brain computing research 18th june 2020Dr G R Sinha
Medical Image Processing is study of acquisition, processing and analysis of various types of medical image modalities. Biomedical Imaging is one such modalities that mainly includes EEG, EMG, fMRI, MEG signals and their analysis for numerous applications such as diagnosis of mental disorder, sleep analysis, cognitive ability, study of memory and attention. Cognitive Science Research exploits biomedical modalities related to human brain and make use of the images in decoding brain commands and understanding them. This is very important in brain computer interface (BCI) and assessment of cognitive abilities. The abilities of human brain with the help of EEG signals can be described, decoded and used in performing desired tasks in numerous applications like robotics, driverless cars etc. EEG records brain activities especially electrical activities which are actually due to psychological, physiological and other changes in human brain. This lecture highlights an overview of cognitive science and brain computing research with its challenges and opportunities.
This document discusses challenges and opportunities around working with real-world data. It notes that while data is plentiful, real-world data is difficult to obtain due to issues like data silos and privacy concerns. It also discusses problems with data interoperability, quality, reliability, and needing more than just analytics to gain insights. The document advocates for linked open data streams with metadata and scalable analytics tools combined with domain knowledge to create actionable knowledge from real-world data. It concludes by listing challenges and opportunities in providing infrastructure, publishing and analyzing heterogeneous and private data at scale.
Matthew Kitching is a data scientist with over 15 years of experience in artificial intelligence, machine learning, and data science. He holds a Ph.D. in Computer Science from the University of Toronto specializing in artificial intelligence. He has worked as a data scientist at Bell Canada and Apption, developing predictive models and data strategies. He has extensive experience in Python, R, Spark, and Hadoop.
6G networking and connectivity promises significant improvements over 5G through innovative architectures and technologies. 6G aims to enable near-instant, unlimited wireless connectivity to support novel applications like telepresence, autonomous vehicles, and bio-IoT. It envisions integrating space, air, and maritime communications with terrestrial networks. 6G is expected to expand spectrum usage to low THz and visible light bands and employ technologies like nanonetworking, bionetworking, optical networking, and 3D networking. Major research challenges for 6G include developing low-power circuits for new spectrum ranges, seamless integration of multiple technologies, and addressing security and privacy issues in distributed networks.
Towards a Smart (City) Data Science. A case-based retrospective on policies, ... - Enrico Daga
This document summarizes a presentation by Dr. Enrico Daga on smart city data science. It discusses several projects and initiatives in Milton Keynes, UK related to building infrastructure for smart city applications and research. This includes a data hub cataloguing city datasets, tools to support data science education and pilots with local businesses. It also covers work on privacy-aware systems, policy propagation in data flows, and using city data to power simulations and games.
Data ethics and machine learning: discrimination, algorithmic bias, and how t... - Data Driven Innovation
Machine learning and data mining algorithms construct predictive models and decision making systems based on big data. Big data are the digital traces of human activities - opinions, preferences, movements, lifestyles, ... - hence they reflect all human biases and prejudices. Therefore, the models learnt from big data may inherit all such biases, leading to discriminatory decisions. In my talk, I discuss many real examples, from crime prediction to credit scoring to image recognition, and how we can tackle the problem of discovering discrimination using the very same approach: data mining.
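A common starting point for the discrimination discovery described in the talk is the disparate impact ratio: the rate of favorable outcomes for a protected group divided by the rate for everyone else, with ratios below 0.8 commonly flagged (the "four-fifths rule"). A minimal sketch on invented toy data, not the talk's own examples:

```python
# Disparate impact ratio: P(positive | protected) / P(positive | unprotected).
# Values below 0.8 are commonly flagged (the "four-fifths rule").
def disparate_impact(outcomes, protected):
    pos_prot = sum(o for o, p in zip(outcomes, protected) if p)
    n_prot = sum(protected)
    pos_unprot = sum(o for o, p in zip(outcomes, protected) if not p)
    n_unprot = len(protected) - n_prot
    return (pos_prot / n_prot) / (pos_unprot / n_unprot)

# Toy data: 1 = favorable decision, True = member of protected group.
outcomes  = [1, 0, 0, 1, 1, 1, 0, 1]
protected = [True, True, True, True, False, False, False, False]
print(disparate_impact(outcomes, protected))  # 2/4 over 3/4 -> 0.666...
```

A ratio of about 0.67 here would fall below the 0.8 threshold, so this toy decision rule would merit closer inspection.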
The 4th paradigm of research is manifest in the rising popularity of data science. Data science developments relevant to human genetics are discussed with particular reference to cloud computing and data accessibility.
American Society for Human Genetics, October 16, 2018, San Diego
Europe needs a clear strategy for leveraging Big Data
Economy in Europe. Our objectives are to work at the technical, business and policy levels, shaping the future through the positioning of Big Data in Horizon 2020, and to bring the necessary stakeholders into a sustainable, industry-led initiative that will greatly enhance EU competitiveness by taking full advantage of Big Data technologies.
The document provides an overview of an event on emerging trends in data science given by Dr. Joanne Luciano. It discusses the data science workflow and various processes involved. Some key trends highlighted include increased use of AI and machine learning in data management and reporting, growth of natural language processing, advances in deep learning, emphasis on data privacy and ethics. The document also promotes the new minor in data science offered at University of the Virgin Islands, covering required courses and examples of course sequences for different disciplines.
Machine Learning and AI: An Intuitive Introduction - CFA Institute Masterclass - QuantUniversity
Learn how artificial intelligence (AI) and machine learning are revolutionizing financial services — this course will introduce key concepts and illustrate the role of machine learning, data science techniques, and AI through examples and case studies from the investment industry. The presentation uses simple mathematics and basic statistics to provide an intuitive understanding of machine learning, as used by financial firms, to augment traditional investment decision making.
This overview session offers a tour of machine learning and AI methods, examining case studies to understand the technology companies, data vendors, banks, and fintech startups that are the key players in trading and investment management. Practical examples and case studies will help participants understand key machine learning methodologies, choose an algorithm for a specific goal, and recognize when to use machine learning and AI techniques.
This document provides an overview of big data in Catalonia. It defines big data as large volumes of complex data that require new technologies to extract value. Big data is characterized by its volume, variety, and velocity. The document discusses open data and data ethics. It also reviews the global big data market and how big data is important for different industries. Finally, it examines big data applications, the big data ecosystem in Catalonia, business cases, and the role of big data in addressing COVID-19.
This document provides an overview of big data in Catalonia. It defines big data as large volumes of data that require new technologies to process and extract value due to issues with volume, variety, and velocity. It discusses related concepts like open data, data ethics, artificial intelligence, machine learning, and more. The document also examines the global big data market and applications across different industry sectors and UN Sustainable Development Goals.
Big Data Applications & Analytics Motivation: Big Data and the Cloud; Centerp... - Geoffrey Fox
Motivating Introduction to MOOC on Big Data from an applications point of view https://bigdatacoursespring2014.appspot.com/course
Course says:
Geoffrey motivates the study of X-informatics by describing data science and clouds. He starts with striking examples of the data deluge, drawn from research, business, and the consumer. The growing number of jobs in data science is highlighted. He describes industry trends in both clouds and big data.
He introduces the cloud computing model, developed at remarkable speed by industry. The four paradigms of scientific research are described, with the growing importance of the data-oriented paradigm. He covers three major X-informatics areas, Physics, e-Commerce and Web Search, followed by a broad discussion of cloud applications. Parallel computing in general, and particular features of MapReduce, are described. He comments on data science education and the benefits of using MOOCs.
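The MapReduce features mentioned in the course follow a simple pattern: a map step emits key-value pairs and a reduce step aggregates values per key. A minimal in-process sketch of the canonical word count, purely illustrative and not tied to any particular framework:

```python
from collections import defaultdict

def map_phase(documents):
    # Map: emit a (word, 1) pair for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key, then sum the counts per word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data and clouds", "Data science and big data"]
print(reduce_phase(map_phase(docs)))  # {'big': 2, 'data': 3, 'and': 2, ...}
```

In a real MapReduce system the map and reduce phases run in parallel across many machines, with the framework handling the shuffle between them; the per-key aggregation logic is the same.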
Digital Transformation:
Business process re-engineering with digital technologies
Technology was once used to make existing work more efficient; now it is transforming the work itself
Example: single shared item lookup process in blockchain supply chain
Productivity gains
Capital investment in technology
Data centers
Blockchain as a Service, Deep Learning nets
Skilled work force development
Train 1000 software developers
Hyperledger, Ethereum, Corda
Machine Learning, AI, Deep Learning
Scale efficiencies
Natural resources, regional strength, large companies
Manage global trade supply chain with blockchain/deep learning
Story of Bigdata and its Applications in Financial Institutions - ijtsrd
The importance of big data is nothing new, but managing data efficiently is only now becoming attainable. Although data management has evolved considerably since the 1800s, advancements in recent years have made the process even more efficient. Data mining is widely used in the banking industry, helping banks compete in the market and offer the right product to the right customer, while collecting and combining different sources of data into a single, significant, volumetric golden source of truth can be achieved by applying the right combination of tools. In this paper the authors introduce big data technologies in brief, along with their applications. Phani Bhooshan | Dr. C. Umashankar "Story of Bigdata and its Applications in Financial Institutions" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3 | Issue-6, October 2019, URL: https://www.ijtsrd.com/papers/ijtsrd29145.pdf Paper URL: https://www.ijtsrd.com/computer-science/database/29145/story-of-bigdata-and-its-applications-in-financial-institutions/phani-bhooshan
Data engineering is the practice of designing and building systems for collecting, storing, and analyzing data at scale. It allows organizations to collect massive amounts of data and ensure the data is highly usable by data scientists and analysts. As data volumes continue to grow exponentially, data engineers are needed to process and channel data to enable fields like machine learning and deep learning.
This document proposes a theme on big data analytics research. It motivates the importance of big data due to the exponential growth of digital data and limitations of traditional databases. The power of big data analytics is discussed through its wide applications in health, policymaking, smart cities, education and robotics. The objectives are outlined as large-scale machine learning, distributed computing, theory development, and multi-disciplinary analytics. Hong Kong is well positioned for this research due to its institutions, industries and potential collaborators. A multi-university and interdisciplinary approach is advocated to tackle big data challenges and transform society through new technologies, applications, insights and knowledge.
The document provides an introduction to big data, including:
1) It defines big data and discusses its key characteristics of volume, velocity, and variety.
2) It describes sources of big data like sensors, social media, and purchase transactions.
3) It discusses big data analytics including descriptive, predictive, and prescriptive analytics and the stages of capture, organize, analyze, and act.
The document discusses fairness, accountability, and transparency (FAT) in recommender systems. It begins with an introduction of the presenter, Denis Parra, who is an associate professor in Chile studying recommender systems. The presentation then discusses some examples of recent advances in artificial intelligence like natural language processing, self-driving cars, and mastering the game of Go. However, it also notes there are some problems with bias in AI systems that affect areas like criminal risk assessments and facial analysis. The presentation suggests recommender systems can also be affected by these issues and discusses ways researchers are working to address fairness, explainability, and transparency in machine learning models and applications like recommender systems.
An introductory take on the ethical issues surrounding the use of algorithms and machine learning in finance, education, law enforcement and defense. This work was stimulated by, but is not a product or authorized content from the IEEE P7003 WG.
Disclaimer: This work is mine alone and does not reflect view of IEEE, IEEE 7003 WG, my employer.
Similar to Human-Centered Machine Learning: Harnessing Visualization and Interactivity for Unlocking Black-Boxes in AI
The Effect of Explanations & Algorithmic Accuracy on Visual Recommender Syste... - Denis Parra Santander
My presentation at ACM Conference on Intelligent User Interfaces (IUI 2019) "The Effect of Explanations & Algorithmic Accuracy on Visual Recommender Systems of Artistic Images"
Do Better ImageNet Models Transfer Better... for Image Recommendation? - Denis Parra Santander
1) The document presents research on using different pre-trained deep learning models from ImageNet for the task of artwork recommendation.
2) An experiment showed that the performance of models on ImageNet did not correlate with their performance on the artwork recommendation task.
3) Fine-tuning the models on the artwork data improved performance over using the pre-trained models directly, with deep fine-tuning working better than shallow fine-tuning. Fine-tuning even on a small dataset was beneficial.
Interactive Recommender Systems: Bridging the gap between predictive algorithms and interactive user interfaces.
Invited talk at UFMG, Brasil. March 2017.
More on this topic:
Chen He, Denis Parra, and Katrien Verbert. 2016. Interactive recommender systems. Expert Syst. Appl. 56, C (September 2016), 9-27. DOI=http://dx.doi.org/10.1016/j.eswa.2016.02.013
This was my final project back in 2009, in the class of Natural Language Processing at the CS department in University of Pittsburgh, PA, USA, class taught by professor Rebecca Hwa.
It has many details on the backup slides about LDA, hyperparameters, how to calculate the distributions based on MLE, etc.
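As background for the MLE material mentioned above: the maximum-likelihood estimate of a categorical distribution (such as a topic's word distribution in LDA, before smoothing) is simply the normalized count vector. A hedged sketch with hypothetical counts, not taken from the original slides:

```python
def mle_distribution(counts):
    # MLE for a categorical distribution: normalize the observed counts
    # so the probabilities sum to one.
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

# Hypothetical word-topic assignment counts for a single topic.
word_counts = {"model": 6, "topic": 3, "word": 1}
print(mle_distribution(word_counts))  # {'model': 0.6, 'topic': 0.3, 'word': 0.1}
```

LDA itself replaces this bare MLE with a Dirichlet prior (the hyperparameters mentioned in the slides), which smooths the estimate toward the uniform distribution and avoids zero probabilities for unseen words.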
Keynote at Chilean Week of Computer Science. I present a brief overview of algorithms for Recommender and then I present my work Tag-based Recommendation, Implicit Feedback and Visual Interactive Interfaces.
The Effect of Different Set-based Visualizations on User Exploration of Reco... - Denis Parra Santander
The document summarizes two studies that explored different set-based visualizations for helping users explore recommendations. The studies compared TalkExplorer, a graph-based recommender, and SetFusion, which used an interactive Venn diagram. Results showed that visualizing intersections of relevant contexts helped users discover more relevant items. SetFusion may have better supported exploration of multiple intersections through its Venn diagram interface. Future work could explore scaling SetFusion to more data sources and recommendation algorithms.
Twitter in Academic Conferences:Usage, Networking and Participation over Time
ACM Conference on Hypertext and Social Media 2014
----
http://dl.acm.org/citation.cfm?doid=2631775.2631826
http://dx.doi.org/10.1145/2631775.2631826
----
Xidao Wen, University of Pittsburgh
Yu-Ru Lin, University of Pittsburgh
Christoph Trattner, Know-Center
Denis Parra, Pontificia Universidad Católica de Chile
Slides of my presentation at IUI 2014, the visual Hybrid Recommender SetFusion - "See What you Want to See: Visual User-Driven Approach for Recommendation"
http://dl.acm.org/citation.cfm?id=2557542
DEMO available:
http://www.youtube.com/watch?v=9LwSx1V6Yxk
Walk the Talk: Analyzing the relation between implicit and explicit feedback ... - Denis Parra Santander
The document describes a study that analyzed the relationship between implicit and explicit feedback for preference elicitation. The researchers conducted a survey of Last.fm users, collecting demographic data and music listening habits. They also had users rate 100 albums from their listening history. Regression analysis found that implicit feedback (play counts) and recency of listening could predict ratings, but global popularity did not significantly improve predictions. Ongoing work includes incorporating the nested nature of ratings and using alternative evaluation metrics beyond RMSE.
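A regression of the kind described above can be sketched with ordinary least squares on synthetic data standing in for the Last.fm variables; the feature names and coefficients below are hypothetical stand-ins, not the study's actual data or results:

```python
import numpy as np

# Synthetic stand-ins: log play count, recency (days since last listen), rating.
rng = np.random.default_rng(0)
play = rng.uniform(0, 5, 200)          # log play counts
recency = rng.uniform(0, 365, 200)     # days since last listen
rating = 1.0 + 0.6 * play - 0.003 * recency + rng.normal(0, 0.1, 200)

# Ordinary least squares: rating ~ intercept + play + recency.
X = np.column_stack([np.ones_like(play), play, recency])
coef, *_ = np.linalg.lstsq(X, rating, rcond=None)
print(coef)  # roughly [1.0, 0.6, -0.003]
```

With enough data, the fitted coefficients recover the generating ones: a positive weight on play counts and a negative weight on recency, mirroring the study's finding that more plays and more recent listening predict higher ratings.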
Network Visualization guest lecture at #DataVizQMSS at @Columbia / #SNA at PU... - Denis Parra Santander
- First version was a guest lecture about Network Visualization in the class "Data Visualization" taught by Dr. Sharon Hsiao in the QMSS program at Columbia University http://www.columbia.edu/~ih2240/dataviz/index.htm
- This updated version was delivered in our class on SNA at PUC Chile in the MPGI master program.
* A short introduction to myself (where I am from, what my hobbies are)
* A presentation of my research activities over the last two years, with a more detailed look at the latest paper I wrote with Xavier Amatriain, to be presented at UMAP 2011
Evaluation of Collaborative Filtering Algorithms for Recommending Articles on... - Denis Parra Santander
Presentation given at the Workshop "Web 3.0: Merging Semantic Web and Social Web" in the conference Hypertext 2009, Torino, Italy.
The workshop online proceedings are here: http://ftp1.de.freebsd.org/Publications/CEUR-WS/Vol-467/
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf - Malak Abu Hammad
Discover how MongoDB Atlas and vector search technology can revolutionize your application's search capabilities. This comprehensive presentation covers:
* What is Vector Search?
* Importance and benefits of vector search
* Practical use cases across various industries
* Step-by-step implementation guide
* Live demos with code snippets
* Enhancing LLM capabilities with vector search
* Best practices and optimization strategies
Perfect for developers, AI enthusiasts, and tech leaders. Learn how to leverage MongoDB Atlas to deliver highly relevant, context-aware search results, transforming your data retrieval process. Stay ahead in tech innovation and maximize the potential of your applications.
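Under the hood, vector search of the kind the presentation covers ranks documents by embedding similarity. A minimal cosine-similarity sketch in plain NumPy, with toy vectors; this illustrates the idea only and is not the MongoDB Atlas API:

```python
import numpy as np

def top_k(query, vectors, k=2):
    # Cosine similarity: normalize, then take dot products with the query.
    q = query / np.linalg.norm(query)
    v = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    sims = v @ q
    return np.argsort(sims)[::-1][:k]   # indices of the k closest vectors

# Toy 3-dimensional "embeddings" for three documents.
docs = np.array([[0.9, 0.1, 0.0],
                 [0.0, 1.0, 0.1],
                 [0.8, 0.2, 0.1]])
print(top_k(np.array([1.0, 0.0, 0.0]), docs))
```

Production systems such as Atlas Vector Search use approximate nearest-neighbor indexes to avoid this brute-force scan, but the ranking criterion is the same similarity measure.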
5th LF Energy Power Grid Model Meet-up Slides - DanBrown980551
5th Power Grid Model Meet-up
It is with great pleasure that we extend to you an invitation to the 5th Power Grid Model Meet-up, scheduled for 6th June 2024. This event will adopt a hybrid format, allowing participants to join us either through an online Microsoft Teams session or in person at TU/e, located at Den Dolech 2, Eindhoven, Netherlands. The meet-up will be hosted by Eindhoven University of Technology (TU/e), a research university specializing in engineering science & technology.
Power Grid Model
The global energy transition is placing new and unprecedented demands on Distribution System Operators (DSOs). Alongside upgrades to grid capacity, processes such as digitization, capacity optimization, and congestion management are becoming vital for delivering reliable services.
Power Grid Model is an open source project from Linux Foundation Energy and provides a calculation engine that is increasingly essential for DSOs. It offers a standards-based foundation enabling real-time power systems analysis, simulations of electrical power grids, and sophisticated what-if analysis. In addition, it enables in-depth studies and analysis of the electrical power grid’s behavior and performance. This comprehensive model incorporates essential factors such as power generation capacity, electrical losses, voltage levels, power flows, and system stability.
Power Grid Model is currently being applied in a wide variety of use cases, including grid planning, expansion, reliability, and congestion studies. It can also help in analyzing the impact of renewable energy integration, assessing the effects of disturbances or faults, and developing strategies for grid control and optimization.
What to expect
For the upcoming meetup we are organizing, we have an exciting lineup of activities planned:
-Insightful presentations covering two practical applications of the Power Grid Model.
-An update on the latest advancements in Power Grid Model technology during the first and second quarters of 2024.
-An interactive brainstorming session to discuss and propose new feature requests.
-An opportunity to connect with fellow Power Grid Model enthusiasts and users.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf - Chart Kalyan
A Mix Chart displays historical data of numbers in a graphical or tabular form. The Kalyan Rajdhani Mix Chart specifically shows the results of a sequence of numbers over different periods.
Driving Business Innovation: Latest Generative AI Advancements & Success Story - Safe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Monitoring and Managing Anomaly Detection on OpenShift.pdf - Tosin Akinosho
Monitoring and Managing Anomaly Detection on OpenShift
Overview
Dive into the world of anomaly detection on edge devices with our comprehensive hands-on tutorial. This SlideShare presentation will guide you through the entire process, from data collection and model training to edge deployment and real-time monitoring. Perfect for those looking to implement robust anomaly detection systems on resource-constrained IoT/edge devices.
Key Topics Covered
1. Introduction to Anomaly Detection
- Understand the fundamentals of anomaly detection and its importance in identifying unusual behavior or failures in systems.
2. Understanding Edge (IoT)
- Learn about edge computing and IoT, and how they enable real-time data processing and decision-making at the source.
3. What is ArgoCD?
- Discover ArgoCD, a declarative, GitOps continuous delivery tool for Kubernetes, and its role in deploying applications on edge devices.
4. Deployment Using ArgoCD for Edge Devices
- Step-by-step guide on deploying anomaly detection models on edge devices using ArgoCD.
5. Introduction to Apache Kafka and S3
- Explore Apache Kafka for real-time data streaming and Amazon S3 for scalable storage solutions.
6. Viewing Kafka Messages in the Data Lake
- Learn how to view and analyze Kafka messages stored in a data lake for better insights.
7. What is Prometheus?
- Get to know Prometheus, an open-source monitoring and alerting toolkit, and its application in monitoring edge devices.
8. Monitoring Application Metrics with Prometheus
- Detailed instructions on setting up Prometheus to monitor the performance and health of your anomaly detection system.
9. What is Camel K?
- Introduction to Camel K, a lightweight integration framework built on Apache Camel, designed for Kubernetes.
10. Configuring Camel K Integrations for Data Pipelines
- Learn how to configure Camel K for seamless data pipeline integrations in your anomaly detection workflow.
11. What is a Jupyter Notebook?
- Overview of Jupyter Notebooks, an open-source web application for creating and sharing documents with live code, equations, visualizations, and narrative text.
12. Jupyter Notebooks with Code Examples
- Hands-on examples and code snippets in Jupyter Notebooks to help you implement and test anomaly detection models.
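As a complement to the deployment-focused steps above, the detection logic itself can be as simple as a z-score threshold over sensor readings. This is a hedged stand-in for whatever model the tutorial actually trains, shown only to make the end-to-end pipeline concrete:

```python
import numpy as np

def zscore_anomalies(values, threshold=3.0):
    # Flag points more than `threshold` standard deviations from the mean.
    v = np.asarray(values, dtype=float)
    z = (v - v.mean()) / v.std()
    return np.where(np.abs(z) > threshold)[0]

# Hypothetical sensor readings with one obvious spike.
readings = [20.1, 19.8, 20.3, 20.0, 19.9, 35.0, 20.2, 20.1]
print(zscore_anomalies(readings, threshold=2.0))  # index of the 35.0 spike
```

In the architecture the tutorial describes, readings like these would arrive via Kafka, the detector would run on the edge device, and Prometheus would scrape a counter of flagged anomalies.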
Generating privacy-protected synthetic data using Secludy and Milvus - Zilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
Letter and Document Automation for Bonterra Impact Management (fka Social Sol... - Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on automated letter generation for Bonterra Impact Management using Google Workspace or Microsoft 365.
Interested in deploying letter generation automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Your One-Stop Shop for Python Success: Top 10 US Python Development Providers - akankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Dive into the realm of operating systems (OS) with Pravash Chandra Das, a seasoned Digital Forensic Analyst, as your guide. 🚀 This comprehensive presentation illuminates the core concepts, types, and evolution of OS, essential for understanding modern computing landscapes.
Beginning with the foundational definition, Das clarifies the pivotal role of OS as system software orchestrating hardware resources, software applications, and user interactions. Through succinct descriptions, he delineates the diverse types of OS, from single-user, single-task environments like early MS-DOS iterations, to multi-user, multi-tasking systems exemplified by modern Linux distributions.
Crucial components like the kernel and shell are dissected, highlighting their indispensable functions in resource management and user interface interaction. Das elucidates how the kernel acts as the central nervous system, orchestrating process scheduling, memory allocation, and device management. Meanwhile, the shell serves as the gateway for user commands, bridging the gap between human input and machine execution. 💻
The narrative then shifts to a captivating exploration of prominent desktop OSs, Windows, macOS, and Linux. Windows, with its globally ubiquitous presence and user-friendly interface, emerges as a cornerstone in personal computing history. macOS, lauded for its sleek design and seamless integration with Apple's ecosystem, stands as a beacon of stability and creativity. Linux, an open-source marvel, offers unparalleled flexibility and security, revolutionizing the computing landscape. 🖥️
Moving to the realm of mobile devices, Das unravels the dominance of Android and iOS. Android's open-source ethos fosters a vibrant ecosystem of customization and innovation, while iOS boasts a seamless user experience and robust security infrastructure. Meanwhile, discontinued platforms like Symbian and Palm OS evoke nostalgia for their pioneering roles in the smartphone revolution.
The journey concludes with a reflection on the ever-evolving landscape of OS, underscored by the emergence of real-time operating systems (RTOS) and the persistent quest for innovation and efficiency. As technology continues to shape our world, understanding the foundations and evolution of operating systems remains paramount. Join Pravash Chandra Das on this illuminating journey through the heart of computing. 🌟
Main news related to the CCS TSI 2023 (2023/1695) - Jakub Marek
An English 🇬🇧 translation of a presentation to the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on Communications and signalling systems on Railways, which was held in Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). Attended by around 500 participants and 200 on-line followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack - shyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Digital Marketing Trends in 2024 | Guide for Staying Ahead (Wask)
https://www.wask.co/ebooks/digital-marketing-trends-in-2024
Feeling lost in the digital marketing whirlwind of 2024? Technology is changing, consumer habits are evolving, and staying ahead of the curve feels like a never-ending pursuit. This e-book is your compass. Dive into actionable insights to handle the complexities of modern marketing. From hyper-personalization to the power of user-generated content, learn how to build long-term relationships with your audience and unlock the secrets to success in the ever-shifting digital landscape.
leewayhertz.com - AI in predictive maintenance: Use cases, technologies, benefits ... (alexjohnson7307)
Predictive maintenance is a proactive approach that anticipates equipment failures before they happen. At the forefront of this innovative strategy is Artificial Intelligence (AI), which brings unprecedented precision and efficiency. AI in predictive maintenance is transforming industries by reducing downtime, minimizing costs, and enhancing productivity.
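As a toy sketch of the anticipation idea described above (real predictive-maintenance systems use learned models; the sensor readings, window size, and threshold here are invented for illustration), a rolling z-score over recent readings can flag a sudden deviation before a hard failure:

```python
import statistics

def flag_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate sharply from the rolling mean of the
    previous `window` samples -- a minimal stand-in for the learned
    failure-prediction models the text describes."""
    alerts = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean = statistics.fmean(recent)
        stdev = statistics.stdev(recent)
        if stdev > 0 and abs(readings[i] - mean) / stdev > threshold:
            alerts.append(i)
    return alerts

# Hypothetical vibration sensor trace: steady, then a sharp spike.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 0.95, 1.1, 1.0, 9.5]
print(flag_anomalies(vibration))  # flags index 9, the final spike
```

An AI-based system replaces the fixed threshold with a model trained on historical failure data, but the workflow is the same: monitor, score, and alert before the failure occurs.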
Human-Centered Machine Learning: Harnessing Visualization and Interactivity for Unlocking Black-Boxes in AI
1. Human-Centered Machine Learning: Harnessing Visualization and Interactivity for Unlocking Black-Boxes in AI
Denis Parra, Assistant Professor
CS Department
School of Engineering
Pontificia Universidad Católica de Chile
IMT, May 30, 2018
2. Presents …
• Assistant Professor, DCC PUC
• Teaching
– Undergraduate:
• IIC 1005 Exploratorio del Major de Computación
• IIC 2026 Visualización de Información
– Graduate:
• IIC 3633 Sistemas Recomendadores
– Magíster MPGI (Data Mining), Big Data Diploma (Visualization)
• Research: SocVis Lab (http://socvis.ing.puc.cl)
– Machine Learning applications (RecSys), information visualization, information retrieval, visual analytics
– 5 Master's and 2 PhD students
– 3 undergraduate students
D. Parra ~ IMT PUC Chile, 6/7/18
3. We are living incredible days…
• Technology is showing results that resemble science fiction…
5. Natural Language Processing
• IBM Watson beats humans in Jeopardy. << ... With all of its processing CPU power, Watson can scan two million pages of data in three seconds.>> (E. Nyberg, CMU professor)
• Implications: applications in the health domain.
http://www.aaai.org/Magazine/Watson/watson.php
• A Jeopardy-style clue: <<Chile shares its largest frontier with this country …>>
7. Music Generation with Style
• Deep learning-driven jazz generation
• https://github.com/jisungk/deepjazz
• https://soundcloud.com/deepjazz-ai
9. AI for incorporating style in images
https://github.com/luanfujun/deep-painterly-harmonization
10. But there are some problems
11. AI for automatic decision making…
https://www.fastcompany.com/40557688/this-plan-for-an-ai-based-direct-democracy-outsources-votes-to-a-predictive-algorithm
12. Some voices call for calm…
https://medium.com/@mijordan3/artificial-intelligence-the-revolution-hasnt-happened-yet-5e1d5812e1e7
Thus, just as humans built buildings and bridges before there was civil engineering, humans are proceeding with the building of societal-scale, inference-and-decision-making systems that involve machines, humans and the environment. Just as early buildings and bridges sometimes fell to the ground — in unforeseen ways and with tragic consequences — many of our early societal-scale inference-and-decision-making systems are already exposing serious conceptual flaws.
13. Part II
• What happened on May 25th, 2018?
• What role can Information Visualization and Interaction play in AI?
14. So, what happened on May 25th, 2018?
• The EU General Data Protection Regulation (GDPR) became enforceable.
15. And why do we care in this room?
• The GDPR not only applies to organisations located within the EU; it also applies to organisations located outside of the EU if they offer goods or services to, or monitor the behaviour of, EU data subjects.
• It applies to all companies processing and holding the personal data of data subjects residing in the European Union, regardless of the company's location.
17. What is the effect on my current practice?
Right to explanation
• Article 15 "Right of access by the data subject"
• Article 22 "Automated individual decision-making, including profiling"
• Recital 71 (linked to Art. 22)
19. Article 15
<<The data subject shall have the right to obtain … access to the personal data … [information of] the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved … >>
21. Article 22 (I)
<<The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. … >>
22. Article 22 (II)
<<Decisions referred to in paragraph 2 shall not be based on special categories of personal data referred to in Article 9(1), >>
Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs … (cont.)
23. Article 22 (II, cont.)
… or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be prohibited.
25. Recital 71
In order to ensure fair and transparent processing in respect of the data subject, taking into account the specific circumstances and context in which the personal data are processed, the controller should use appropriate mathematical or statistical procedures for the profiling …
26. Other Initiatives (2018)
This bill would require the creation of a task force that provides recommendations on how information on agency automated decision systems may be shared with the public, and how agencies may address instances where people are harmed by agency automated decision systems.
27. ML and GDPR
• How do we explain Machine Learning models?
• From Decision Trees to Deep Neural Networks
– Decision tree: explainable decision model, explicit variables, not very accurate
– Deep neural network: black-box decision model, latent variables, accurate
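The contrast on this slide can be made concrete with a hand-written decision rule (the loan scenario, feature names, and thresholds here are all invented for illustration): a tree-style model can return, alongside every decision, the explicit rule that produced it — the kind of "meaningful information about the logic involved" that Article 15 asks for, and exactly what a deep network's latent variables do not provide.

```python
def loan_decision(income, debt_ratio, years_employed):
    """A hand-written decision tree: every decision is returned
    together with the explicit rule that produced it."""
    if debt_ratio > 0.5:
        return "deny", f"debt_ratio={debt_ratio} exceeds 0.5"
    if income >= 30000 and years_employed >= 2:
        return "approve", "income >= 30000 and years_employed >= 2"
    return "review", "no automatic rule matched; route to a human"

decision, reason = loan_decision(income=45000, debt_ratio=0.3, years_employed=5)
print(decision, "-", reason)
```

A trained neural network making the same call would output only a score; recovering a human-readable reason from it is the open research problem the following slides address.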
28. Challenges
• Are there methods to explain models?
– Can we create methods to help us explain decisions made by complex models such as DNNs?
• Can visualization & interaction help with this issue?
32. IEEE 2017 VAST Best Paper
• Visualizing Dataflow Graphs of Deep Learning Models in TensorFlow
Authors: Kanit Wongsuphasawat, Daniel Smilkov, James Wexler, Jimbo Wilson, Dandelion Mané, Doug Fritz, Dilip Krishnan, Fernanda B. Viégas, and Martin Wattenberg
33. How do RNNs work?
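The question on this slide can be answered in a few lines: a vanilla RNN repeatedly applies the same update, h_t = tanh(W_xh·x_t + W_hh·h_{t-1} + b), folding each input into a hidden state that summarizes the sequence so far. A minimal sketch (the tiny weights and the input sequence are invented for illustration):

```python
import math

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One vanilla-RNN step: h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b_h)."""
    return [
        math.tanh(
            sum(W_xh[i][j] * x_t[j] for j in range(len(x_t)))
            + sum(W_hh[i][k] * h_prev[k] for k in range(len(h_prev)))
            + b_h[i]
        )
        for i in range(len(b_h))
    ]

# Illustrative fixed weights: 2-dim input, 2-dim hidden state.
W_xh = [[0.5, -0.3], [0.8, 0.2]]
W_hh = [[0.1, 0.4], [-0.2, 0.3]]
b_h = [0.0, 0.1]

h = [0.0, 0.0]
for x in [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]:  # a 3-step input sequence
    h = rnn_step(x, h, W_xh, W_hh, b_h)
print(h)  # the hidden state now summarizes the whole sequence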
42. https://fatconference.org/
• The FAT* Conference 2018 is a two-day event that brings together researchers and practitioners interested in fairness, accountability, and transparency in socio-technical systems.
46. Topic Models
• Very popular models for analyzing collections of text.
• They summarize large corpora by finding topics in an unsupervised fashion.
47. Topic Models (cont.)
But they also:
– report topics that make little sense,
– produce redundant topics, or
– give undue weight to frequent words in the topic vocabulary.
And they do not work well with short texts.
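The frequent-word caveat is easy to demonstrate (the three-document toy corpus below is invented for illustration; real topic models such as LDA are far more sophisticated, but the bias has the same root): raw frequency lets a stop word like "the" dominate, while an IDF-style reweighting that discounts words appearing in every document pushes it back down.

```python
import math
from collections import Counter

docs = [
    "the model learns the topics from the data",
    "the data covers sports and the weather",
    "sports fans watch the games",
]

# Naive keyword extraction by raw frequency: 'the' dominates,
# which is the frequent-word bias the slide warns about.
raw = Counter(w for d in docs for w in d.split())
print(raw.most_common(3))

# An IDF-style reweighting down-weights words that occur in every
# document, so 'the' (document frequency 3 of 3) scores zero.
doc_freq = Counter(w for d in docs for w in set(d.split()))
weighted = {w: c * math.log(len(docs) / doc_freq[w]) for w, c in raw.items()}
top = sorted(weighted, key=weighted.get, reverse=True)[:3]
print(top)
```

The human-in-the-loop interfaces discussed next attack the same problem interactively: letting users merge redundant topics and demote uninformative vocabulary instead of relying on a fixed weighting formula.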
52. Research (M.F. Sepúlveda)
• Development of interactive interfaces that let people work with unsupervised models, using a human-in-the-loop approach for semi-supervision.
53. IEEE VIS 2017 – Panel ML & Vis?
54. IEEE VIS 2017 – Panel ML & Vis?
55. Conclusion
• Deep Neural Networks have produced tremendous progress in several fields in recent years (computer vision, NLP, recommender systems, etc.)
• Technology is not innocuous: depending on the data used and the characteristics of the methods, it can have serious implications for our societies, not always positive.
• Foreseeing the impact of ML and AI, regulations are requiring transparency, explainability, inspectability and other characteristics that need further research.
57. Research Opportunities
• Effect of ML models on social issues
– Filter bubbles / echo chambers
– Did Facebook influence the results of presidential elections?
Facebook newsfeed recommendation model
58. Research Opportunities
• New Millennium Institute for Foundational Research on Data (IMFD)
– Led by Professor Marcelo Arenas, DCC PUC Chile
– Researchers from PUC, UChile, USM, UdeC
– Areas: Computer Science, Statistics, Mathematics, Sociology, Political Science, Communications (Journalism)
59. References
• Goodman, B., & Flaxman, S. (2016). EU regulations on algorithmic decision-making and a "right to explanation". ICML Workshop on Human Interpretability in Machine Learning (WHI 2016), New York, NY. https://arxiv.org/abs/1606.08813v1
• Edwards, L., & Veale, M. (2017). Slave to the Algorithm? Why a 'Right to Explanation' is Probably Not the Remedy You are Looking For. Duke Law & Technology Review.