Information technology is increasingly dominated by artificial intelligence, which plays a key role in providing better services. The inherent strengths of artificial intelligence are driving companies toward a modern, decisive, secure, and insight-driven arena in which to address current and future challenges. Key technologies such as cloud computing, the Internet of Things (IoT), and software-defined networking (SDN) are emerging as platforms for future applications and are delivering benefits to society. Integrating artificial intelligence with these innovations at scale takes their beneficiaries to the next level of efficiency. Data generated by heterogeneous devices are received, exchanged, stored, managed, and analyzed to automate the overall system and make it more performant and reliable. These new technologies are not free of limitations, however, and their synthesis raises many challenges in terms of scalability and reliability. This paper therefore discusses the role of artificial intelligence (AI), along with the issues and opportunities that confront all communities in integrating these technologies reliably and at scale. It also puts forward future directions related to the scalability and reliability concerns that arise during this integration, enabling researchers to address the current research gaps.
Artificial intelligence is part of almost every business today; it facilitates business operations, increases productivity, and offers a variety of ways to speed up communication processes. Artificial intelligence, software (or software applications installed on it), and automation through AI systems perform many of the tasks previously carried out by employees and workers. Switching to an automated working environment has eliminated many unnecessary business expenses, produced substantial time savings, and gradually increased profits. Automating various business processes through AI has taken many companies and organizations to the next level in terms of production and management. So, this article explains the role of artificial intelligence, machine learning, and cloud computing in business. Whig, P. 2019. Artificial Intelligence and Machine Learning in Business. International Journal on Integrated Education 2, 2 (Jun. 2019). https://journals.researchparks.org/index.php/IJIE/article/view/516/493 https://journals.researchparks.org/index.php/IJIE/article/view/516
Intelligent computing relating to cloud computing - Er. Rahul Abhishek
This paper contends that real understanding of natural language, and the fulfillment of cloud computing, cannot be achieved without dealing with the significant factor of sentiment. It points out that the achievement and enjoyment of cloud computing is highly reliant on breakthroughs in advanced intelligence. In this paper, advanced intelligence refers to a high level of interaction between natural intelligence and artificial intelligence.
We introduce an intelligent computing language into the software so that machines can make decisions autonomously and in real time. By applying artificial intelligence to the cloud, we hope to develop a system through which computers can manage themselves. For example, computer scientists are looking to develop software that monitors computers' power consumption and regulates their operation according to the specific needs at any given time, thus reducing energy expenditure.
Embedding artificial intelligence into code that will run in the cloud to improve efficiency is one of the strongest research lines. It is part of a drive to create applications, executed in the cloud, that go beyond basic automation to anticipate situations and make decisions in real time over the Internet.
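The power-aware self-management idea described above can be sketched as a small controller that maps power readings to operating modes. Everything here, the thresholds, mode names, and functions, is an illustrative assumption for the sake of the example, not drawn from the paper.

```python
# Hypothetical sketch: a controller that watches power readings and
# picks an operating mode, regulating operation to reduce energy use.
# Thresholds and mode names are illustrative assumptions.

def choose_mode(power_watts, low=40, high=90):
    """Map a sampled power reading (watts) to an operating mode."""
    if power_watts < low:
        return "performance"   # plenty of headroom: run at full speed
    if power_watts < high:
        return "balanced"      # moderate draw: keep current settings
    return "powersave"         # near the budget: throttle to save energy

def regulate(readings):
    """Choose a mode for each sampled power reading."""
    return [choose_mode(w) for w in readings]

print(regulate([25, 60, 120]))  # → ['performance', 'balanced', 'powersave']
```

A real system would feed `regulate` from hardware power counters and act on the chosen mode (e.g. CPU frequency scaling); the sketch only shows the decision step.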
Cloud computing and artificial intelligence - Furqan Haider
Hi friends, I have created this simple yet helpful slide deck about the application of AI in cloud computing for a presentation held in my class. I hope it helps those who are looking for this topic, since it is a rare one and hardly anyone else has made a presentation about it. Thanks!
Have a nice day :)
Cognitive computing, big data, statistical analytics - Pietro Leo
Cognitive computing, big data, and statistical analytics represent new frontiers for innovation that will transform organizations. These emerging technologies rely on analyzing vast amounts of structured and unstructured data using powerful computers and sophisticated algorithms to generate novel insights. Realizing their full potential will require integrating data and analytics from hundreds or thousands of diverse sources to reduce uncertainty and construct high-value context. This represents a strategic challenge for organizations to create an integrated view of information from all available data channels.
Machine learning and AI in a brave new cloud world - Ulf Mattsson
Machine learning platforms are one of the fastest growing services of the public cloud. ML, an approach and set of technologies that use Artificial Intelligence (AI) concepts, is directly related to pattern recognition and computational learning. Early adopters of AI have now rolled out cloud-based services that are bringing AI to the masses.
How are AI, deep learning, machine learning, big data, and cloud related? Can machine learning algorithms enable the use of an individual’s comprehensive biological information to predict or diagnose diseases, and to find or develop the best therapy for that individual? How is Quantum Computing in the Cloud related to the use of AI and Cybersecurity?
Join this webinar to learn more about:
- Machine Learning, Data Discovery and Cloud
- Cloud-Based ML Applications and ML services from AWS and Google Cloud
- How to Automate Machine Learning
SmartData Webinar: Cognitive Computing in the Mobile App Economy - DATAVERSITY
Mobility is transforming work and life throughout the planet. Mobile apps, built for a growing range of handhelds, wearables, Internet of Things devices, and other platforms, are becoming the universal access paths to commerce, content, and community in the 21st century. The app economy refers to this new world where every decision, action, exploration, and experience is continuously enriched and optimized through the cloud-served apps that accompany you everywhere. In this webinar, James Kobielus, IBM's Big Data Evangelist, will discuss the potential of cognitive computing to super-power the emerging app economy. In addition to providing an overview of IBM's Watson strategy for cognitive computing, Kobielus will go in-depth on IBM's strategic partnership with Apple to draw on the strengths of each company to transform enterprise mobility through a new class of apps that leverage IBM's Watson-based big data analytics cloud and add value to Apple's iPhone and iPad platforms in diverse industries.
https://www.learntek.org/blog/top-10-technology-trends-in-2019/
Learntek is global online training provider on Big Data Analytics, Hadoop, Machine Learning, Deep Learning, IOT, AI, Cloud Technology, DEVOPS, Digital Marketing and other IT and Management courses.
What if Things Start to Think - Artificial Intelligence in IoT - Muralidhar Somisetty
Artificial intelligence will be functionally necessary to wield the vast number of connected “things” online, and will be even more important in making sense of an almost endless sea of data streamed in from these devices.
KM - Cognitive Computing overview by Ken Martin, 13 Apr 2016 - HCL Technologies
This document provides an introduction to cognitive computing and how it relates to knowledge management strategies. It begins with an overview of Ken Martin's background and the agenda. It then defines key cognitive computing concepts and technologies like natural language processing, machine learning, and pattern recognition. The document contrasts traditional and cognitive systems, noting cognitive systems are interactive, self-learning, and expand conversations. It maps cognitive capabilities to the KM lifecycle, showing how capabilities like natural language processing, text mining, and social network analysis can enhance each stage.
The document describes an AI-driven Occupational Skills Generator (AIOSG) that aims to automate the process of creating occupational skills reference documents. The AIOSG utilizes an intelligent web crawler, natural language processing, neural networks, and a blockchain to gather data on occupational skills from various sources, analyze the data, and generate standardized skills reference documents. It is intended to reduce the time and resources required to manually produce these documents while ensuring more comprehensive and up-to-date skills information. The AIOSG system architecture and its use of analytics, artificial intelligence, and blockchain technologies are explained in detail.
Semantic Computing will make the Internet of Things 2 - Bob Connell
Semantic computing will help unlock the full potential of data collected by internet-connected devices by adding context and meaning. It can process both structured and unstructured data from a variety of sources to provide quicker analysis. This will allow organizations to gain deeper customer insights, access more complete information for decision making, better predict outcomes, continuously improve products/services, and extend specialist expertise. By utilizing semantic computing to analyze data from internet-connected devices more effectively, organizations can realize greater value from their internet of things implementations.
This document provides an executive overview of Semantic Software, including:
- Semantic Software is an Australian software company that has developed a cognitive computing solution based on semantic computing and owns a comprehensive patent portfolio.
- The problem is that the world has an overabundance of data but current software is inadequate for analyzing and making sense of it.
- The proposed solution is cognitive computing, which uses techniques like semantic computing, deep learning, machine learning, and natural language processing to allow computers to do the heavy lifting of data analysis. Semantic Software claims to have one of the only comprehensive semantic computing platforms.
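The "adding context and meaning" idea in the description above can be illustrated with a tiny enrichment step: raw device readings are joined with semantic metadata so downstream analysis can interpret them. The device registry, identifiers, and tags below are invented for the example and are not taken from Semantic Software's platform.

```python
# Hypothetical sketch: semantic enrichment of raw IoT readings.
# A registry maps opaque device IDs to meaning (what is measured,
# in which unit, where), turning bare numbers into interpretable data.

DEVICE_REGISTRY = {
    "t-101": {"kind": "temperature", "unit": "celsius", "location": "cold-room"},
    "h-202": {"kind": "humidity", "unit": "percent", "location": "greenhouse"},
}

def enrich(reading):
    """Attach semantic context to a raw {'device', 'value'} reading."""
    meta = DEVICE_REGISTRY.get(reading["device"], {})
    return {**reading, **meta}

raw = {"device": "t-101", "value": 4.2}
print(enrich(raw))
# → {'device': 't-101', 'value': 4.2, 'kind': 'temperature',
#    'unit': 'celsius', 'location': 'cold-room'}
```

The same pattern scales to unstructured sources by replacing the static registry with an ontology or knowledge-graph lookup.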
Artificial Intelligence and Cognitive Computing - Florian Georg
Keynote talk with a high-level introduction to A.I., cognitive systems, machine learning, and the IBM Watson & Cloud Platform at the Datadirect IT Security Forum 2017.
The Future of the IoT will be cognitive - IBM Point of View - Thorsten Schroeer
I gave this presentation at the 2nd Lake Constance Supplier Dialogue in October 2017 in Friedrichshafen, Germany, as part of the German Purchasing Association.
BetaGroup - Tech Trends in 2017, a snapshot by BetaGroup - Mohammed Cherif
The document discusses 13 emerging technology trends for the next few years: 1) Artificial Intelligence and Machine Learning, 2) Intelligent Apps, 3) Intelligent Things, 4) Virtual and Augmented Reality, 5) Digital Twins, 6) Blockchain, 7) Conversational Systems, 8) Digital Technology Platforms, 9) Adaptive Security Architecture, 10) Mesh App & Service Architecture, 11) Humanized Big Data, 12) Physical Digital Integrations, and 13) Everything on Demand. Each trend is briefly described and examples are provided. The document cautions that accurately predicting technology's future is difficult due to potential surprises.
There are essential security considerations in the systems used by semiconductor companies like TI. Along with other semiconductor companies, TI has recognized that IT security is crucial throughout web application developers' system development life cycle (SDLC). The challenges faced by TI web developers were consolidated via questionnaires, starting with how risk management and secure coding can be reinforced in the SDLC, and how to achieve IT security, project management (PM), and SDLC initiatives by developing a prototype, which was then evaluated against these goals. This study aimed to put NIST strategies into practice by integrating risk management checkpoints into the SDLC; to enforce secure coding using a static code analysis tool by developing a prototype application mapped to IT security goals, project management, and SDLC initiatives; and to evaluate the impact of the proposed solution. The paper discusses how SecureTI was able to satisfy IT security requirements in the SDLC and PM phases.
AI in Business - Key drivers and future value - APPANION
Artificial Intelligence is undoubtedly a hyped topic at the moment. But what is the reasoning for investors and digital platform players to bet very large amounts of money on this technology right now? To better understand the current market dynamics and to give an overview of renowned predictions for the coming 2-3 years, we compiled a practical overview of the topic. This report covers the major driving forces of AI, assumptions about the future from industry thought leaders, and practical advice on how to start AI projects within your company.
The 5G next-generation networking paradigm, with its envisioned capacity, coverage, and data transfer rates, provides a fertile field for novel application scenarios. Virtual, Mixed, and Augmented Reality will play a key role as visualization, interaction, and information delivery platforms. Recent hardware and software developments in immersive technologies (AR, VR, and MR), in particular the commercial availability of advanced headsets equipped with XR-accelerated processing units and Software Development Kits (SDKs), are significantly increasing the penetration of such devices for entertainment, corporate, and industrial use. This trend creates next-generation usage models that raise serious technical challenges at all networking and software architecture levels to support the immersive digital transformation.
IBM Cognitive Business Strategy presentation - diannepatricia
IBM Cognitive Business Strategy presentation. Presented by Dianne Fodell and Jim Spohrer at the Cognitive Systems Institute Group Speaker Series call on October 8, 2015.
Ethical AI: Establish an AI/ML Governance framework addressing Reproducibility, Explainability, Bias & Accountability for Enterprise AI use-cases.
Presentation on “Open Source Enterprise AI/ML Governance” at Linux Foundation’s Open Compliance Summit, Dec 2020 (https://events.linuxfoundation.org/open-compliance-summit/)
Full article: https://towardsdatascience.com/ethical-ai-its-implications-for-enterprise-ai-use-cases-and-governance-81602078f5db
The document discusses AI and IoT, highlighting several use cases and challenges. It notes that AI and IoT are transforming how people, devices, and data interact across many domains. Specifically, it provides examples of how Philips analyzes 15PB of patient data and how AI can connect disparate IoT data. Additionally, it outlines several common use cases for applying AI to IoT in industries like manufacturing, energy, healthcare, and more. Finally, it contrasts bare IoT with AIoT, noting that AIoT adds the intelligent data processing, self-learning, and autonomous decision making that enhance IoT.
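The bare-IoT vs. AIoT contrast drawn above can be made concrete with a minimal sketch: instead of merely forwarding readings, the edge logic learns a running baseline from the stream and flags anomalies locally. The class name, window size, and threshold are illustrative assumptions, not taken from the document.

```python
# Hypothetical sketch of AIoT-style edge intelligence: a detector
# that self-learns a baseline (running mean/std over a window) and
# autonomously flags readings that deviate strongly from it.

from collections import deque
from statistics import mean, stdev

class EdgeAnomalyDetector:
    def __init__(self, window=20, threshold=3.0):
        self.history = deque(maxlen=window)  # recent readings only
        self.threshold = threshold           # deviation in std-devs

    def observe(self, value):
        """Record a reading; return True if it looks anomalous."""
        anomalous = False
        if len(self.history) >= 5:  # wait for a minimal baseline
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) > self.threshold * sigma:
                anomalous = True
        self.history.append(value)
        return anomalous
```

A bare-IoT pipeline would ship every reading to the cloud and decide there; the AIoT version above makes the routine decision at the edge and only escalates the anomalies.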
Gartner identified the top 10 strategic technology trends for 2014, including mobile device diversity and management, mobile apps and applications, and the Internet of Everything. Other trends include hybrid cloud/IT as a service broker, cloud/client architecture, the era of personal cloud, software defined anything, web-scale IT, smart machines, and 3D printing. Gartner predicts these trends will have significant impacts on IT organizations through 2018 as mobile use expands, cloud adoption grows, and new technologies like smart machines and 3D printing disrupt existing models.
The document discusses factors enabling China's rise as a global artificial intelligence hub, including industries interested in employing AI, a large talent pool, a significant mobile internet market, access to high-performance computing, and supportive government policies. It notes that industries interested in AI and the breadth of talent available give China a competitive advantage. Several examples are provided of Chinese companies developing AI applications and the government supporting development of the sector.
Gene Villeneuve - Moving from descriptive to cognitive analytics - IBM Sverige
As the scope of big data rapidly expands, so does the scope of the analytics needed to extract insight from that data. It is simply impossible for humans, or indeed rules-based engines, to turn that information into action. More and more, clients need analytics to make the best decisions possible; or better yet, to embed those analytics into processes that automate decision making, simply delivering the answers to the questions being asked at the point of impact. To address these rapidly evolving needs, we need to ensure the right analytics capabilities are deployed to suit each situation, each point of interaction, and each decision point within a process. Join this session and learn how IBM can provide a solution for the varying types of analytics: from descriptive to predictive to prescriptive to cognitive.
This document discusses standards for the Internet of Things (IoT). It makes three key points:
1. Achieving interoperability across different industry sectors and standards bodies will be challenging due to competing business interests but is crucial for IoT success.
2. Testbeds that allow for plug-and-play testing of components from different vendors can help advance interoperability and assess new business models.
3. Viewing testbeds as a service could help recoup the large investments required and facilitate collaboration across organizations.
The document discusses how artificial intelligence and machine learning are becoming increasingly important in the financial services industry. It provides examples of how AI/ML can be used for applications like customer experience enhancement, credit decisioning, fraud detection, intelligent document processing, predictive analytics, and personalized recommendations. The document also summarizes some key AWS machine learning services that financial institutions can use to build impactful AI/ML solutions and accelerate their adoption of these technologies.
THE INTERNET OF THINGS: NEW INTEROPERABILITY, MANAGEMENT AND SECURITY CHALLENGES - IJNSA Journal
The Internet of Things (IoT) brings connectivity to almost every object found in the physical space, extending connectivity to everyday objects. From connected fridges to cars and cities, the IoT creates opportunities in numerous domains. However, this increase in connectivity creates many prominent challenges. This paper provides a survey of some of the major issues challenging the widespread adoption of the IoT, focusing in particular on interoperability, management, security, and privacy. It concludes that a multifaceted technology approach to IoT security, management, and privacy is needed.
KM - Cognitive Computing overview by Ken Martin 13Apr2016HCL Technologies
This document provides an introduction to cognitive computing and how it relates to knowledge management strategies. It begins with an overview of Ken Martin's background and the agenda. It then defines key cognitive computing concepts and technologies like natural language processing, machine learning, and pattern recognition. The document contrasts traditional and cognitive systems, noting cognitive systems are interactive, self-learning, and expand conversations. It maps cognitive capabilities to the KM lifecycle, showing how capabilities like natural language processing, text mining, and social network analysis can enhance each stage.
The document describes an AI-driven Occupational Skills Generator (AIOSG) that aims to automate the process of creating occupational skills reference documents. The AIOSG utilizes an intelligent web crawler, natural language processing, neural networks, and a blockchain to gather data on occupational skills from various sources, analyze the data, and generate standardized skills reference documents. It is intended to reduce the time and resources required to manually produce these documents while ensuring more comprehensive and up-to-date skills information. The AIOSG system architecture and its use of analytics, artificial intelligence, and blockchain technologies are explained in detail.
Semantic Computing will make the Internet of Things 2Bob Connell
Semantic computing will help unlock the full potential of data collected by internet-connected devices by adding context and meaning. It can process both structured and unstructured data from a variety of sources to provide quicker analysis. This will allow organizations to gain deeper customer insights, access more complete information for decision making, better predict outcomes, continuously improve products/services, and extend specialist expertise. By utilizing semantic computing to analyze data from internet-connected devices more effectively, organizations can realize greater value from their internet of things implementations.
This document provides an executive overview of Semantic Software, including:
- Semantic Software is an Australian software company that has developed a cognitive computing solution based on semantic computing and owns a comprehensive patent portfolio.
- The problem is that the world has an overabundance of data but current software is inadequate for analyzing and making sense of it.
- The proposed solution is cognitive computing, which uses techniques like semantic computing, deep learning, machine learning, and natural language processing to allow computers to do the heavy lifting of data analysis. Semantic Software claims to have one of the only comprehensive semantic computing platforms.
Artificial Intelligence and Cognitive ComputingFlorian Georg
Keynote talk with some high level introduction on A.I., Cognitive systems, Machine Learning and IBM Watson & Cloud Platform @ Datadirect IT Security Forum 2017
The Future of the IoT will be cognitive - IBM Point of ViewThorsten Schroeer
I gave this presentation at the 2nd Lake Constance Supplier Dialogue in October 2017 in Friedriechshafen/Germany as part of the German Purchasing Association.
BetaGroup - Tech Trends in 2017, a snap shot by BetaGroupMohammed Cherif
The document discusses 13 emerging technology trends for the next few years: 1) Artificial Intelligence and Machine Learning, 2) Intelligent Apps, 3) Intelligent Things, 4) Virtual and Augmented Reality, 5) Digital Twins, 6) Blockchain, 7) Conversational Systems, 8) Digital Technology Platforms, 9) Adaptive Security Architecture, 10) Mesh App & Service Architecture, 11) Humanized Big Data, 12) Physical Digital Integrations, and 13) Everything on Demand. Each trend is briefly described and examples are provided. The document cautions that accurately predicting technology's future is difficult due to potential surprises.
There are essential security considerations in the systems used by semiconductor companies like TI. Along
with other semiconductor companies, TI has recognized that IT security is highly crucial during web
application developers' system development life cycle (SDLC). The challenges faced by TI web developers
were consolidated via questionnaires starting with how risk management and secure coding can be
reinforced in SDLC; and how to achieve IT Security, PM and SDLC initiatives by developing a prototype
which was evaluated considering the aforementioned goals. This study aimed to practice NIST strategies
by integrating risk management checkpoints in the SDLC; enforce secure coding using static code analysis
tool by developing a prototype application mapped with IT Security goals, project management and SDLC
initiatives and evaluation of the impact of the proposed solution. This paper discussed how SecureTI was
able to satisfy IT Security requirements in the SDLC and PM phases.
AI in Business - Key drivers and future valueAPPANION
Artificial Intelligence is undoubtedly a hyped topic at the moment. But what is the reasoning for investors and digital platform players to bet very large amounts of money on this technology right now? To better understand the current market dynamics and to give an overview of renown predictions for the upcoming 2-3 years, we compiled a practical overview of this topic. This report covers the major driving forces of AI, assumptions for the future from the industry thought leaders as well as practical advice on how to start AI projects within your company.
The 5G next-generation networking paradigm, with its envisioned capacity, coverage, and data transfer rates, provides fertile ground for novel application scenarios. Virtual, mixed, and augmented reality will play a key role as visualization, interaction, and information delivery platforms. Recent hardware and software developments in immersive technologies (AR, VR, and MR), in particular the commercial availability of advanced headsets equipped with XR-accelerated processing units and software development kits (SDKs), are significantly increasing the penetration of such devices for entertainment, corporate, and industrial use. This trend creates next-generation usage models that raise serious technical challenges at every networking and software architecture level needed to support the immersive digital transformation.
IBM Cognitive Business Strategy presentation (diannepatricia)
IBM Cognitive Business Strategy presentation. Presented by Dianne Fodell and Jim Spohrer at the Cognitive Systems Institute Group Speaker Series call on October 8, 2015.
Ethical AI: Establish an AI/ML Governance framework addressing Reproducibility, Explainability, Bias & Accountability for Enterprise AI use-cases.
Presentation on “Open Source Enterprise AI/ML Governance” at Linux Foundation’s Open Compliance Summit, Dec 2020 (https://events.linuxfoundation.org/open-compliance-summit/)
Full article: https://towardsdatascience.com/ethical-ai-its-implications-for-enterprise-ai-use-cases-and-governance-81602078f5db
The document discusses AI and IoT, highlighting several use cases and challenges. It notes that AI and IoT are transforming how people, devices, and data interact across many domains. Specifically, it provides examples of how Philips analyzes 15 PB of patient data and how AI can connect disparate IoT data. Additionally, it outlines several common use cases for applying AI to IoT in industries such as manufacturing, energy, and healthcare. Finally, it contrasts bare IoT with AIoT, noting that AIoT adds intelligent data processing, self-learning, and autonomous decision making that enhance IoT.
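As a toy illustration of the "intelligent data processing" that distinguishes AIoT from bare IoT, a minimal sketch might flag statistical outliers in a sensor stream before forwarding data upstream (the readings and threshold below are invented for illustration):

```python
from statistics import mean, stdev

def flag_anomalies(readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(readings), stdev(readings)
    return [x for x in readings if sigma and abs(x - mu) / sigma > threshold]

# A temperature stream with one faulty spike
temps = [21.0, 21.2, 20.9, 21.1, 21.0, 35.0, 21.3]
print(flag_anomalies(temps, threshold=2.0))  # [35.0]
```

Bare IoT would simply transmit all seven readings; the AIoT layer filters or acts on the anomaly locally.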
Gartner identified the top 10 strategic technology trends for 2014, including mobile device diversity and management, mobile apps and applications, and the Internet of Everything. Other trends include hybrid cloud/IT as a service broker, cloud/client architecture, the era of personal cloud, software defined anything, web-scale IT, smart machines, and 3D printing. Gartner predicts these trends will have significant impacts on IT organizations through 2018 as mobile use expands, cloud adoption grows, and new technologies like smart machines and 3D printing disrupt existing models.
The document discusses factors enabling China's rise as a global artificial intelligence hub, including industries interested in employing AI, a large talent pool, a significant mobile internet market, access to high-performance computing, and supportive government policies. It notes that industries interested in AI and the breadth of talent available give China a competitive advantage. Several examples are provided of Chinese companies developing AI applications and the government supporting development of the sector.
Gene Villeneuve - Moving from descriptive to cognitive analytics (IBM Sverige)
As the scope of big data rapidly expands, so does the scope of the analytics necessary to extract insight from that data. It is simply impossible for humans, or indeed rules-based engines, to turn that information into action. More and more, clients need analytics to make the best decisions possible, or better yet, to embed those analytics into processes to automate decision-making, supplying the answers to the questions being asked at the point of impact. To address these rapidly evolving needs, we need to ensure the right analytics capabilities are deployed to suit each situation, each point of interaction, and each decision point within a process. Join this session and learn how IBM can provide a solution for the varying types of analytics: from descriptive to predictive to prescriptive to cognitive.
This document discusses standards for the Internet of Things (IoT). It makes three key points:
1. Achieving interoperability across different industry sectors and standards bodies will be challenging due to competing business interests but is crucial for IoT success.
2. Testbeds that allow for plug-and-play testing of components from different vendors can help advance interoperability and assess new business models.
3. Viewing testbeds as a service could help recoup the large investments required and facilitate collaboration across organizations.
The document discusses how artificial intelligence and machine learning are becoming increasingly important in the financial services industry. It provides examples of how AI/ML can be used for applications like customer experience enhancement, credit decisioning, fraud detection, intelligent document processing, predictive analytics, and personalized recommendations. The document also summarizes some key AWS machine learning services that financial institutions can use to build impactful AI/ML solutions and accelerate their adoption of these technologies.
THE INTERNET OF THINGS: NEW INTEROPERABILITY, MANAGEMENT AND SECURITY CHALLENGES (IJNSA Journal)
The Internet of Things (IoT) brings connectivity to nearly every object found in the physical space, extending connectivity to everyday objects. From connected fridges to cars and cities, the IoT creates opportunities in numerous domains. However, this increase in connectivity creates many prominent challenges. This paper provides a survey of some of the major issues challenging the widespread adoption of the IoT, focusing in particular on interoperability, management, security, and privacy. It concludes that a multifaceted technology approach to IoT security, management, and privacy is needed.
Explainable AI Over the Internet of Things (IoT): Overview, State-Of-the-Art ... (MurindanyiSudi1)
Explainable AI Over the Internet of Things (IoT): Overview, State-Of-the-Art and Future Directions. The document discusses explainable AI (XAI) and its applications over the internet of things (IoT). It begins with definitions of key concepts such as AI, machine learning, deep learning, explainability, and IoT. It then discusses the role of XAI in improving the transparency, trust, and interpretability of AI models. Applications of XAI over IoT are explored, including security enhancement, healthcare, industrial, and smart-city use cases. Research challenges and future directions are also presented, such as distributed XAI and meeting 6G requirements.
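One standard model-agnostic XAI technique in this space is permutation importance: shuffle one input feature and measure how much the model's accuracy drops. A hedged sketch, where the toy "model", features, and data are all invented for illustration (not taken from the survey):

```python
import random

# A stand-in "model": predicts device risk from (packet_rate, payload_size).
# In this toy model only packet_rate matters, which the explanation reveals.
def model(x):
    return 1 if x[0] > 50 else 0

def permutation_importance(model, X, y, feature, trials=20, seed=0):
    """Accuracy drop when one feature column is shuffled: a basic XAI signal."""
    rng = random.Random(seed)
    base = sum(model(x) == t for x, t in zip(X, y)) / len(X)
    drops = []
    for _ in range(trials):
        col = [x[feature] for x in X]
        rng.shuffle(col)
        Xp = [list(x) for x in X]
        for row, v in zip(Xp, col):
            row[feature] = v
        acc = sum(model(x) == t for x, t in zip(Xp, y)) / len(X)
        drops.append(base - acc)
    return sum(drops) / trials

X = [(10, 200), (90, 210), (20, 190), (80, 205)]
y = [0, 1, 0, 1]
print(permutation_importance(model, X, y, feature=0))  # large drop: matters
print(permutation_importance(model, X, y, feature=1))  # 0.0: irrelevant
```

The same procedure applies unchanged to an opaque IoT intrusion-detection model, which is what makes it attractive for XAI-over-IoT settings.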
This document discusses challenges and techniques for securing Internet of Things (IoT) architecture. It begins with an introduction to IoT and outlines key challenges including privacy, security, scalability, and connectivity issues that arise from the large number of interconnected devices. The document then reviews literature on techniques for securing IoT, such as using network function virtualization (NFV) and information-centric networking (ICN). It describes several proposed secure IoT architectures in detail and compares different approaches. The document concludes by discussing future directions for securing IoT architecture.
RPL AND COAP PROTOCOLS, EXPERIMENTAL ANALYSIS FOR IOT: A CASE STUDY (ijasuc)
The Internet of Things (IoT) is playing a vital role in networking-related applications. Several protocols are available for building IoT applications, among which RPL and CoAP are important. Specific IoT applications and research problems often require customized protocols, and adequate platforms are needed to evaluate the performance of these protocols. Configuring such platforms for a protocol is a crucial and time-consuming task, and at present there is no collective and precise information available to guide this work. This paper discusses two open-source platforms available for IoT, notes that various IoT research ideas require the design of IoT protocols, and mentions a few IoT communication technologies. A detailed analysis of two common protocols, the Routing Protocol for Low-Power and Lossy Networks (RPL) and the Constrained Application Protocol (CoAP), is carried out with respect to latency and packet delivery ratio. The results, discussion, and conclusions presented are useful for researchers interested in working with IoT protocols and standards.
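The two metrics this paper evaluates can be computed directly from experiment logs. A minimal sketch, assuming a log format of `(packet_id, timestamp)` pairs for sends and receives (the format is an assumption, not the paper's):

```python
def pdr_and_latency(sent, received):
    """Packet delivery ratio and mean latency from (id, t_sent)/(id, t_recv) logs."""
    sent_t = dict(sent)
    delivered = [(pid, t_recv - sent_t[pid]) for pid, t_recv in received if pid in sent_t]
    pdr = len(delivered) / len(sent_t)
    avg_latency = sum(d for _, d in delivered) / len(delivered)
    return pdr, avg_latency

sent = [(1, 0.00), (2, 0.10), (3, 0.20), (4, 0.30)]
received = [(1, 0.05), (2, 0.18), (4, 0.42)]      # packet 3 was lost
print(pdr_and_latency(sent, received))            # PDR = 0.75
```

In a Cooja- or testbed-style experiment these logs would come from the sink's and motes' serial output.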
SECURITY AND PRIVACY AWARE PROGRAMMING MODEL FOR IOT APPLICATIONS IN CLOUD EN... (ijccsa)
This document summarizes a research paper on privacy-preserving techniques for IoT data in cloud environments. It introduces two differential privacy algorithms: 1) generic differential privacy (GenDP), which provides generalized privacy protection for homogeneous and heterogeneous IoT metadata through data partitioning, and 2) cluster-based differential privacy, which groups similar data into clusters before defining classifiers to validate privacy. The paper evaluates these techniques and finds that the cluster-based approach offers better security than customized interactive algorithms while maintaining data utility. Overall, the study presents new differential privacy methods for anonymizing IoT metadata stored in the cloud.
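The classic building block underlying differential privacy schemes like these is the Laplace mechanism: add noise scaled to the query's sensitivity divided by the privacy budget epsilon. A minimal sketch of that standard mechanism (not the paper's exact GenDP algorithm):

```python
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Add Laplace(0, sensitivity/epsilon) noise, giving epsilon-differential
    privacy for a numeric query with the given L1 sensitivity."""
    rng = rng or random.Random()
    scale = sensitivity / epsilon
    u = rng.random() - 0.5                      # uniform on [-0.5, 0.5)
    # Inverse-CDF sampling of the Laplace distribution
    noise = -scale * math.copysign(1, u) * math.log(1 - 2 * abs(u))
    return true_value + noise

# Counting query over IoT metadata: sensitivity 1, since adding or removing
# one record changes the count by at most 1.
rng = random.Random(42)
print(laplace_mechanism(true_value=120, sensitivity=1, epsilon=0.5, rng=rng))
```

Smaller epsilon means larger noise and stronger privacy; the noisy count is what would be released to the cloud consumer.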
Simulation, modelling and packet sniffing facilities for IoT: A systematic an... (IJECEIAES)
Man and machine, in the form of heterogeneous devices and sensors, collaborate to give birth to the Internet of Things, the internet of the future. Within a short span of time, some 30 billion intelligent devices in the form of smart applications will be connected, making testing and debugging difficult in terms of time and cost. Simulators play a vital role in verifying applications and providing security before actual deployment in a real environment. Given the constrained environment in terms of memory, computation, and energy, this review paper provides, under a single umbrella, a comprehensive and in-depth analysis of the various barriers and critical design characteristics, along with a comparison of candidate simulators and packet sniffing tools. Post-simulation analysis plays a vital role in understanding the behavior of data and helping the research community satisfy quality-of-service parameters. This review makes it feasible to choose an appropriate simulator and network analyzer tool that fulfills one's needs, helping to make IoT a reality.
The document discusses the integration of Internet of Things (IoT) and cloud computing, referred to as Cloud of Things. It identifies several key issues with this integration, such as protocol support, energy efficiency, resource allocation, identity management, and security/privacy. Potential solutions are provided for some of the issues. The conclusion discusses the need for more study on the impact of these issues based on the specific IoT application and services provided.
The document provides a survey of trust management techniques for the Internet of Things (IoT). It summarizes four key trust management techniques:
1) E-LITHE enhances DTLS security for constrained IoT devices by adding a trusted third party to share secret keys and reduce denial-of-service attacks.
2) GTRS is a graph-based recommender system that calculates trust between IoT devices based on ratings and social relationships to select trusted service providers.
3) TWGA is a trustworthy gateway architecture that establishes trusted paths between domains using device identifiers and public/private keys to authenticate and forward data packets securely.
4) TBBS monitors the behavior and trust of vehicles in an IoT-based vehicular network.
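Recommendation-based schemes such as GTRS combine a node's own experience with peer ratings weighted by how much each recommender is itself trusted. A simplified sketch of that idea, where the blending rule, `alpha`, and the example numbers are illustrative assumptions rather than GTRS's exact formula:

```python
def aggregate_trust(direct, recommendations, alpha=0.6):
    """Blend a device's own experience with weighted peer recommendations.
    `recommendations` is a list of (rating, recommender_trust) pairs."""
    if recommendations:
        total_w = sum(w for _, w in recommendations)
        indirect = sum(r * w for r, w in recommendations) / total_w
    else:
        indirect = direct
    return alpha * direct + (1 - alpha) * indirect

# Direct experience 0.9; two peers rate the provider 0.8 and 0.4,
# and we trust those peers 0.9 and 0.1 respectively.
score = aggregate_trust(0.9, [(0.8, 0.9), (0.4, 0.1)])
print(round(score, 3))  # 0.844
```

Weighting by recommender trust is what limits the impact of a malicious peer submitting a bad rating (here the 0.4 rating barely moves the score).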
This document proposes a middleware called MSOAH-IoT to address heterogeneity issues in IoT applications. The middleware is based on a service-oriented architecture and uses REST APIs to collect data from heterogeneous sensors. It introduces heterogeneous networking interfaces and has been tested on gateways running different operating systems. The middleware aims to support various smart objects using different networking interfaces and OS systems while unifying various data formats. It is implemented on a Raspberry Pi gateway to manage communications at the network edge and handle heterogeneity issues.
Design of a Hybrid Authentication Technique for User and Device Authenticatio... (IRJET Journal)
The document proposes a hybrid authentication technique using blockchain authentication and artificial intelligence for user and device authentication in an integrated Internet of Things (IoT) environment. It discusses challenges with IoT security and reviews existing literature on blockchain and AI approaches for IoT authentication. The proposed technique aims to create a decentralized authentication system using a blockchain to store authentication records that are then used to train a neural network model. This trained model can authenticate new users and devices in the IoT network. The document outlines the objectives, methodology, and potential benefits of the proposed hybrid authentication approach for improving IoT security.
An Event-based Middleware for Syntactical Interoperability in Internet of Th... (IJECEIAES)
The Internet of Things (IoT) connects sensors or devices that record physical observations of the environment with a variety of applications and other internet services. Along with the increasing number and diversity of connected devices, a problem called interoperability arises. One type is syntactical interoperability, where the IoT should be able to connect all devices through various data protocols. To address this problem, we propose a middleware capable of supporting interoperability by providing a multi-protocol gateway between CoAP, MQTT, and WebSocket. The middleware is developed using an event-based architecture implementing the publish-subscribe pattern. We also developed a system to test the middleware's performance in terms of success rate and data delivery delay. The system consists of temperature and humidity sensors using CoAP and MQTT as publishers and a web application using WebSocket as a subscriber. Data transmission from both the CoAP and MQTT sensors had a success rate above 90%, the average delivery delay from both was below 1 second, and the packet loss rate varied between 0% and 25%. Interoperability testing was performed using an interoperability assessment methodology and found the middleware qualified.
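The publish-subscribe pattern at the core of such a gateway can be sketched as an in-process broker: protocol adapters publish into topics, and subscribers receive whatever arrives. This is a minimal in-memory sketch of the pattern only, not the paper's CoAP/MQTT/WebSocket implementation:

```python
from collections import defaultdict

class Broker:
    """Minimal in-process publish-subscribe broker, the pattern used to
    bridge publishers (e.g. CoAP/MQTT adapters) and subscribers."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for cb in self.subscribers[topic]:
            cb(message)

broker = Broker()
received = []
broker.subscribe("sensors/temperature", received.append)
broker.publish("sensors/temperature", {"value": 22.5, "unit": "C"})
broker.publish("sensors/humidity", {"value": 40})   # no subscriber: dropped
print(received)  # [{'value': 22.5, 'unit': 'C'}]
```

Decoupling producers from consumers through topics is what lets each side speak its own wire protocol behind an adapter.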
Lightweight IoT middleware for rapid application development (TELKOMNIKA JOURNAL)
Sensors connected to cloud services equipped with data analytics have created a plethora of new types of applications, ranging from the personal to the industrial level, forming what is known today as the Internet of Things (IoT). An IoT-based system follows a pattern of data collection, data analytics, automation, and system improvement recommendations. However, most applications have their own unique requirements in terms of the type of smart devices, communication technologies, and application provisioning services. To enable an IoT-based system, various commercial services provide backend-as-a-service (BaaS) and software-as-a-service (SaaS) hosted in the cloud, which in turn raises issues of security and privacy; yet there is no plug-and-play IoT middleware framework that can be deployed out of the box on an on-premise server. This paper provides a lightweight IoT middleware that individuals or organizations can use to enable IoT applications while effectively securing the data on-premise or on a remote server. Specifically, it proposes a middleware with a standardized application programming interface (API) that adapts to application requirements through high-level abstraction and interacts with the application service provider. Each API endpoint is secured using an access control list (ACL) and is easily integrable with other modules, ensuring the scalability of the system and easing deployment. In addition, the middleware can be deployed in a distributed manner, with instances coordinating among themselves to fulfil the application requirements. The presented middleware handles GET and POST requests that are lightweight in size, with a footprint of less than 1 KB and a round-trip time of less than 1 second, facilitating rapid application development for securing IoT resources.
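Securing each API endpoint with an access control list amounts to a lookup of (user, method, endpoint) before a request is served. A hedged sketch of that check, where the endpoint paths, user names, and ACL shape are all invented for illustration, not the paper's actual API:

```python
# Illustrative ACL for middleware API endpoints (names are assumptions)
ACL = {
    "/api/sensors": {"alice": {"GET", "POST"}, "bob": {"GET"}},
    "/api/admin":   {"alice": {"GET"}},
}

def authorize(user, method, endpoint):
    """Allow the request only if the ACL grants this user the method."""
    return method in ACL.get(endpoint, {}).get(user, set())

print(authorize("bob", "GET", "/api/sensors"))    # True
print(authorize("bob", "POST", "/api/sensors"))   # False
print(authorize("carol", "GET", "/api/admin"))    # False: unknown user
```

In a real deployment the check would run in request middleware after authentication, and the ACL would live in persistent storage rather than a module-level dict.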
IRJET - Development of Cloud System for IoT Applications (IRJET Journal)
This document discusses the development of cloud systems for IoT applications. It begins with an introduction stating that one major problem IoT faces is storing and managing the vast amounts of data generated. It then reviews six papers covering IoT cloud platforms, cloud storage systems, developments in cloud and IoT, IoT platform development, minimizing energy consumption and SLA violations in cloud data centers, and IoT data classification. It concludes with a detailed review of six IoT platform development approaches and proposes a framework to help select an approach based on requirements.
The document discusses the Internet of Things (IoT), which refers to a global network of machines and devices that can interact with each other. It identifies five essential IoT technologies - radio frequency identification, wireless sensor networks, middleware, cloud computing, and IoT application software. It also examines three categories of enterprise IoT applications - monitoring and control, big data and business analytics, and information sharing and collaboration. Finally, it discusses challenges of IoT implementation including data management, privacy, security, and integration complexity.
Efficient network management and security in 5G enabled internet of things us... (IJECEIAES)
The rise of fifth generation (5G) networks and the proliferation of internet-of-things (IoT) devices have created new opportunities for innovation and increased connectivity. However, this growth has also brought forth several challenges related to network management and security. A review of the literature shows that the majority of existing research is limited to addressing either network management issues or security concerns. In this paper, the proposed work presents an integrated framework that addresses both network management and security in 5G IoT networks using deep learning. First, a joint approach combining an attention mechanism with a long short-term memory (LSTM) model is proposed to forecast network traffic and optimize network resources in a service-based and user-oriented manner. The second contribution is the development of a reliable network attack detection system using an autoencoder mechanism. Finally, a contextual model of 5G-IoT is discussed to demonstrate the scope of the proposed models, quantifying network behavior to drive predictive decision making in resource allocation and attack detection with performance guarantees. Experiments are conducted with respect to various statistical error analyses and other performance indicators to assess the prediction capability of both the traffic forecasting and the attack detection model.
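Autoencoder-based attack detection rests on one idea: reconstruct traffic from a compressed representation learned on benign flows, and flag flows whose reconstruction error is large. The sketch below uses the degenerate linear case (a zero-dimensional code that reconstructs every flow as the benign mean) purely to illustrate the thresholding logic; the paper's detector learns a deep encoder/decoder, and all feature values here are invented:

```python
import math

def train_profile(normal_flows):
    """Mean feature vector of benign traffic: the 'reconstruction' of a
    zero-dimensional autoencoder. Real systems learn deep encoders."""
    n, d = len(normal_flows), len(normal_flows[0])
    mean = [sum(f[i] for f in normal_flows) / n for i in range(d)]
    errors = [math.dist(f, mean) for f in normal_flows]
    threshold = max(errors) * 1.5          # simple margin over benign error
    return mean, threshold

def is_attack(flow, mean, threshold):
    """Flag a flow whose reconstruction error exceeds the benign margin."""
    return math.dist(flow, mean) > threshold

benign = [[100, 1.2], [110, 1.1], [95, 1.3], [105, 1.2]]  # (pkts/s, avg KB)
mean, thr = train_profile(benign)
print(is_attack([104, 1.2], mean, thr))   # False: looks like benign traffic
print(is_attack([900, 0.1], mean, thr))   # True: flood-like deviation
```

The key property carried over from real autoencoders is that the detector is trained only on benign data, so it needs no labeled attack samples.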
Similar to Role of artificial intelligence in cloud computing, IoT and SDN: Reliability and scalability issues (20)
Redefining brain tumor segmentation: a cutting-edge convolutional neural netw... (IJECEIAES)
Medical image analysis has witnessed significant advancements with deep learning techniques. In the domain of brain tumor segmentation, the ability to precisely delineate tumor boundaries from magnetic resonance imaging (MRI) scans holds profound implications for diagnosis. This study presents an ensemble convolutional neural network (CNN) with transfer learning, integrating the state-of-the-art Deeplabv3+ architecture with the ResNet18 backbone. The model is rigorously trained and evaluated, exhibiting remarkable performance metrics, including an impressive global accuracy of 99.286%, a high class accuracy of 82.191%, a mean intersection over union (IoU) of 79.900%, a weighted IoU of 98.620%, and a Boundary F1 (BF) score of 83.303%. Notably, a detailed comparative analysis with existing methods showcases the superiority of the proposed model. These findings underscore the model's competence in precise brain tumor localization, underscoring its potential to revolutionize medical image analysis and enhance healthcare outcomes. This research paves the way for future exploration and optimization of advanced CNN models in medical imaging, emphasizing addressing false positives and resource efficiency.
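The intersection-over-union metric reported above has a direct definition on binary masks: overlapping pixels divided by pixels covered by either mask. A minimal sketch on flat 0/1 lists (a real pipeline would operate on 2-D image arrays):

```python
def iou(pred_mask, true_mask):
    """Intersection over union for binary segmentation masks (flat 0/1 lists)."""
    inter = sum(p and t for p, t in zip(pred_mask, true_mask))
    union = sum(p or t for p, t in zip(pred_mask, true_mask))
    return inter / union if union else 1.0  # both empty: perfect agreement

pred = [0, 1, 1, 1, 0, 0]
true = [0, 0, 1, 1, 1, 0]
print(iou(pred, true))  # 2 intersecting / 4 in union = 0.5
```

Mean IoU averages this score over classes, which is why it sits well below global pixel accuracy when a small tumor class is segmented imperfectly.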
Embedded machine learning-based road conditions and driving behavior monitoring (IJECEIAES)
Car accident rates have increased in recent years, resulting in losses in human lives, properties, and other financial costs. An embedded machine learning-based system is developed to address this critical issue. The system can monitor road conditions, detect driving patterns, and identify aggressive driving behaviors. The system is based on neural networks trained on a comprehensive dataset of driving events, driving styles, and road conditions. The system effectively detects potential risks and helps mitigate the frequency and impact of accidents. The primary goal is to ensure the safety of drivers and vehicles. Collecting data involved gathering information on three key road events: normal street and normal drive, speed bumps, circular yellow speed bumps, and three aggressive driving actions: sudden start, sudden stop, and sudden entry. The gathered data is processed and analyzed using a machine learning system designed for limited power and memory devices. The developed system resulted in 91.9% accuracy, 93.6% precision, and 92% recall. The achieved inference time on an Arduino Nano 33 BLE Sense with a 32-bit CPU running at 64 MHz is 34 ms and requires 2.6 kB peak RAM and 139.9 kB program flash memory, making it suitable for resource-constrained embedded systems.
Advanced control scheme of doubly fed induction generator for wind turbine us... (IJECEIAES)
This paper describes a speed control device for generating electrical energy on an electricity network based on the doubly fed induction generator (DFIG) used for wind power conversion systems. At first, a double-fed induction generator model was constructed. A control law is formulated to govern the flow of energy between the stator of a DFIG and the energy network using three types of controllers: proportional integral (PI), sliding mode controller (SMC) and second order sliding mode controller (SOSMC). Their different results in terms of power reference tracking, reaction to unexpected speed fluctuations, sensitivity to perturbations, and resilience against machine parameter alterations are compared. MATLAB/Simulink was used to conduct the simulations for the preceding study. Multiple simulations have shown very satisfying results, and the investigations demonstrate the efficacy and power-enhancing capabilities of the suggested control system.
Neural network optimizer of proportional-integral-differential controller par... (IJECEIAES)
Wide application of proportional-integral-differential (PID)-regulator in industry requires constant improvement of methods of its parameters adjustment. The paper deals with the issues of optimization of PID-regulator parameters with the use of neural network technology methods. A methodology for choosing the architecture (structure) of neural network optimizer is proposed, which consists in determining the number of layers, the number of neurons in each layer, as well as the form and type of activation function. Algorithms of neural network training based on the application of the method of minimizing the mismatch between the regulated value and the target value are developed. The method of back propagation of gradients is proposed to select the optimal training rate of neurons of the neural network. The neural network optimizer, which is a superstructure of the linear PID controller, allows increasing the regulation accuracy from 0.23 to 0.09, thus reducing the power consumption from 65% to 53%. The results of the conducted experiments allow us to conclude that the created neural superstructure may well become a prototype of an automatic voltage regulator (AVR)-type industrial controller for tuning the parameters of the PID controller.
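The controller being tuned here is the textbook discrete PID law: output = kp·e + ki·∫e dt + kd·de/dt. A minimal sketch of that standard controller driving a simple first-order plant (the plant, gains, and time step are illustrative assumptions; the paper's neural optimizer would adjust kp, ki, kd):

```python
class PID:
    """Discrete PID controller; a neural optimizer would tune kp, ki, kd."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a first-order plant y' = u - y toward setpoint 1.0 via Euler steps
pid, y, dt = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.01), 0.0, 0.01
for _ in range(2000):
    u = pid.step(1.0, y)
    y += (u - y) * dt
print(round(y, 3))  # settles near 1.0
```

The integral term is what removes the steady-state error the paper's accuracy figures refer to; retuning the three gains changes how fast and how smoothly the output settles.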
An improved modulation technique suitable for a three level flying capacitor ... (IJECEIAES)
This research paper introduces an innovative modulation technique for controlling a 3-level flying capacitor multilevel inverter (FCMLI), aiming to streamline the modulation process in contrast to conventional methods. The proposed simplified modulation technique paves the way for more straightforward and efficient control of multilevel inverters, enabling their widespread adoption and integration into modern power electronic systems. Through the amalgamation of sinusoidal pulse width modulation (SPWM) with a high-frequency square wave pulse, this controlling technique attains energy equilibrium across the coupling capacitor. The modulation scheme incorporates a simplified switching pattern and a decreased count of voltage references, thereby simplifying the control algorithm.
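The SPWM component mentioned above works by comparing a sinusoidal reference against a high-frequency triangular carrier: the switch is on wherever the reference exceeds the carrier. A textbook sketch of that comparison (not the paper's combined scheme; frequencies and modulation index are illustrative):

```python
import math

def spwm_switch_states(mod_index, f_ref, f_carrier, t_samples):
    """Sinusoidal PWM: switch on whenever the sine reference exceeds a
    triangular carrier (textbook SPWM, one switching leg)."""
    states = []
    for t in t_samples:
        ref = mod_index * math.sin(2 * math.pi * f_ref * t)
        # Triangle carrier between -1 and 1
        phase = (t * f_carrier) % 1.0
        carrier = 4 * phase - 1 if phase < 0.5 else 3 - 4 * phase
        states.append(1 if ref > carrier else 0)
    return states

ts = [i / 10000 for i in range(200)]           # 20 ms at 10 kHz sampling
states = spwm_switch_states(0.8, 50, 1000, ts)
duty = sum(states) / len(states)
print(round(duty, 2))  # average duty near 0.5 over one 50 Hz cycle
```

Filtering this pulse train recovers a 50 Hz sine scaled by the modulation index; the paper's contribution is in how the additional square-wave pulse balances the flying capacitor on top of this base scheme.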
A review on features and methods of potential fishing zone (IJECEIAES)
This review focuses on the importance of identifying potential fishing zones in seawater for sustainable fishing practices. It explores features such as sea surface temperature (SST) and sea surface height (SSH), along with the classification methods used to classify the data. The study underscores the importance of examining potential fishing zones using advanced analytical techniques and thoroughly explores the methodologies employed by researchers, covering both past and current approaches. The examination centers on data characteristics and the application of classification algorithms for classifying potential fishing zones, whose prediction relies significantly on the effectiveness of those algorithms. Previous research has assessed the performance of models such as support vector machines (SVM), naïve Bayes, and artificial neural networks (ANN); in one earlier result, SVM classified fisheries test data with 97.6% accuracy, compared with 94.2% for naïve Bayes. Considering recent work in this area, several recommendations for future work are presented to further improve the performance of potential fishing zone models, which is important to the fisheries community.
Electrical signal interference minimization using appropriate core material f... (IJECEIAES)
As demand for smaller, quicker, and more powerful devices rises, Moore's law is strictly followed. The industry has worked hard to make small devices that boost productivity, with the goal of optimizing device density. Scientists are reducing connection delays to improve circuit performance, which led them to three-dimensional integrated circuit (3D IC) concepts that stack active devices and create vertical connections to diminish latency and shorten interconnects. Electrical interference is a major concern with 3D integrated circuits. Researchers have developed and tested through-silicon vias (TSVs) and substrates to decrease electrical wave interference. This study illustrates a novel noise-coupling reduction method using several electrical interference models. A 22% drop in interference from wave-carrying to victim TSVs introduces this new paradigm and improves system performance even at higher THz frequencies.
Electric vehicle and photovoltaic advanced roles in enhancing the financial p...IJECEIAES
Climate change's impact on the planet forced the United Nations and governments to promote green energies and electric transportation. The deployments of photovoltaic (PV) and electric vehicle (EV) systems gained stronger momentum due to their numerous advantages over fossil fuel types. The advantages go beyond sustainability to reach financial support and stability. The work in this paper introduces the hybrid system between PV and EV to support industrial and commercial plants. This paper covers the theoretical framework of the proposed hybrid system including the required equation to complete the cost analysis when PV and EV are present. In addition, the proposed design diagram which sets the priorities and requirements of the system is presented. The proposed approach allows setup to advance their power stability, especially during power outages. The presented information supports researchers and plant owners to complete the necessary analysis while promoting the deployment of clean energy. The result of a case study that represents a dairy milk farmer supports the theoretical works and highlights its advanced benefits to existing plants. The short return on investment of the proposed approach supports the paper's novelty approach for the sustainable electrical system. In addition, the proposed system allows for an isolated power setup without the need for a transmission line which enhances the safety of the electrical network
Bibliometric analysis highlighting the role of women in addressing climate ch...IJECEIAES
Fossil fuel consumption increased quickly, contributing to climate change
that is evident in unusual flooding and draughts, and global warming. Over
the past ten years, women's involvement in society has grown dramatically,
and they succeeded in playing a noticeable role in reducing climate change.
A bibliometric analysis of data from the last ten years has been carried out to
examine the role of women in addressing the climate change. The analysis's
findings discussed the relevant to the sustainable development goals (SDGs),
particularly SDG 7 and SDG 13. The results considered contributions made
by women in the various sectors while taking geographic dispersion into
account. The bibliometric analysis delves into topics including women's
leadership in environmental groups, their involvement in policymaking, their
contributions to sustainable development projects, and the influence of
gender diversity on attempts to mitigate climate change. This study's results
highlight how women have influenced policies and actions related to climate
change, point out areas of research deficiency and recommendations on how
to increase role of the women in addressing the climate change and
achieving sustainability. To achieve more successful results, this initiative
aims to highlight the significance of gender equality and encourage
inclusivity in climate change decision-making processes.
Voltage and frequency control of microgrid in presence of micro-turbine inter...IJECEIAES
The active and reactive load changes have a significant impact on voltage
and frequency. In this paper, in order to stabilize the microgrid (MG) against
load variations in islanding mode, the active and reactive power of all
distributed generators (DGs), including energy storage (battery), diesel
generator, and micro-turbine, are controlled. The micro-turbine generator is
connected to MG through a three-phase to three-phase matrix converter, and
the droop control method is applied for controlling the voltage and
frequency of MG. In addition, a method is introduced for voltage and
frequency control of micro-turbines in the transition state from gridconnected mode to islanding mode. A novel switching strategy of the matrix
converter is used for converting the high-frequency output voltage of the
micro-turbine to the grid-side frequency of the utility system. Moreover,
using the switching strategy, the low-order harmonics in the output current
and voltage are not produced, and consequently, the size of the output filter
would be reduced. In fact, the suggested control strategy is load-independent
and has no frequency conversion restrictions. The proposed approach for
voltage and frequency regulation demonstrates exceptional performance and
favorable response across various load alteration scenarios. The suggested
strategy is examined in several scenarios in the MG test systems, and the
simulation results are addressed.
Enhancing battery system identification: nonlinear autoregressive modeling fo...IJECEIAES
Precisely characterizing Li-ion batteries is essential for optimizing their
performance, enhancing safety, and prolonging their lifespan across various
applications, such as electric vehicles and renewable energy systems. This
article introduces an innovative nonlinear methodology for system
identification of a Li-ion battery, employing a nonlinear autoregressive with
exogenous inputs (NARX) model. The proposed approach integrates the
benefits of nonlinear modeling with the adaptability of the NARX structure,
facilitating a more comprehensive representation of the intricate
electrochemical processes within the battery. Experimental data collected
from a Li-ion battery operating under diverse scenarios are employed to
validate the effectiveness of the proposed methodology. The identified
NARX model exhibits superior accuracy in predicting the battery's behavior
compared to traditional linear models. This study underscores the
importance of accounting for nonlinearities in battery modeling, providing
insights into the intricate relationships between state-of-charge, voltage, and
current under dynamic conditions.
Smart grid deployment: from a bibliometric analysis to a surveyIJECEIAES
Smart grids are one of the last decades' innovations in electrical energy.
They bring relevant advantages compared to the traditional grid and
significant interest from the research community. Assessing the field's
evolution is essential to propose guidelines for facing new and future smart
grid challenges. In addition, knowing the main technologies involved in the
deployment of smart grids (SGs) is important to highlight possible
shortcomings that can be mitigated by developing new tools. This paper
contributes to the research trends mentioned above by focusing on two
objectives. First, a bibliometric analysis is presented to give an overview of
the current research level about smart grid deployment. Second, a survey of
the main technological approaches used for smart grid implementation and
their contributions are highlighted. To that effect, we searched the Web of
Science (WoS), and the Scopus databases. We obtained 5,663 documents
from WoS and 7,215 from Scopus on smart grid implementation or
deployment. With the extraction limitation in the Scopus database, 5,872 of
the 7,215 documents were extracted using a multi-step process. These two
datasets have been analyzed using a bibliometric tool called bibliometrix.
The main outputs are presented with some recommendations for future
research.
Use of analytical hierarchy process for selecting and prioritizing islanding ...IJECEIAES
One of the problems that are associated to power systems is islanding
condition, which must be rapidly and properly detected to prevent any
negative consequences on the system's protection, stability, and security.
This paper offers a thorough overview of several islanding detection
strategies, which are divided into two categories: classic approaches,
including local and remote approaches, and modern techniques, including
techniques based on signal processing and computational intelligence.
Additionally, each approach is compared and assessed based on several
factors, including implementation costs, non-detected zones, declining
power quality, and response times using the analytical hierarchy process
(AHP). The multi-criteria decision-making analysis shows that the overall
weight of passive methods (24.7%), active methods (7.8%), hybrid methods
(5.6%), remote methods (14.5%), signal processing-based methods (26.6%),
and computational intelligent-based methods (20.8%) based on the
comparison of all criteria together. Thus, it can be seen from the total weight
that hybrid approaches are the least suitable to be chosen, while signal
processing-based methods are the most appropriate islanding detection
method to be selected and implemented in power system with respect to the
aforementioned factors. Using Expert Choice software, the proposed
hierarchy model is studied and examined.
Enhancing of single-stage grid-connected photovoltaic system using fuzzy logi...IJECEIAES
The power generated by photovoltaic (PV) systems is influenced by
environmental factors. This variability hampers the control and utilization of
solar cells' peak output. In this study, a single-stage grid-connected PV
system is designed to enhance power quality. Our approach employs fuzzy
logic in the direct power control (DPC) of a three-phase voltage source
inverter (VSI), enabling seamless integration of the PV connected to the
grid. Additionally, a fuzzy logic-based maximum power point tracking
(MPPT) controller is adopted, which outperforms traditional methods like
incremental conductance (INC) in enhancing solar cell efficiency and
minimizing the response time. Moreover, the inverter's real-time active and
reactive power is directly managed to achieve a unity power factor (UPF).
The system's performance is assessed through MATLAB/Simulink
implementation, showing marked improvement over conventional methods,
particularly in steady-state and varying weather conditions. For solar
irradiances of 500 and 1,000 W/m2
, the results show that the proposed
method reduces the total harmonic distortion (THD) of the injected current
to the grid by approximately 46% and 38% compared to conventional
methods, respectively. Furthermore, we compare the simulation results with
IEEE standards to evaluate the system's grid compatibility.
Enhancing photovoltaic system maximum power point tracking with fuzzy logic-b...IJECEIAES
Photovoltaic systems have emerged as a promising energy resource that
caters to the future needs of society, owing to their renewable, inexhaustible,
and cost-free nature. The power output of these systems relies on solar cell
radiation and temperature. In order to mitigate the dependence on
atmospheric conditions and enhance power tracking, a conventional
approach has been improved by integrating various methods. To optimize
the generation of electricity from solar systems, the maximum power point
tracking (MPPT) technique is employed. To overcome limitations such as
steady-state voltage oscillations and improve transient response, two
traditional MPPT methods, namely fuzzy logic controller (FLC) and perturb
and observe (P&O), have been modified. This research paper aims to
simulate and validate the step size of the proposed modified P&O and FLC
techniques within the MPPT algorithm using MATLAB/Simulink for
efficient power tracking in photovoltaic systems.
Adaptive synchronous sliding control for a robot manipulator based on neural ...IJECEIAES
Robot manipulators have become important equipment in production lines, medical fields, and transportation. Improving the quality of trajectory tracking for
robot hands is always an attractive topic in the research community. This is a
challenging problem because robot manipulators are complex nonlinear systems
and are often subject to fluctuations in loads and external disturbances. This
article proposes an adaptive synchronous sliding control scheme to improve trajectory tracking performance for a robot manipulator. The proposed controller
ensures that the positions of the joints track the desired trajectory, synchronize
the errors, and significantly reduces chattering. First, the synchronous tracking
errors and synchronous sliding surfaces are presented. Second, the synchronous
tracking error dynamics are determined. Third, a robust adaptive control law is
designed,the unknown components of the model are estimated online by the neural network, and the parameters of the switching elements are selected by fuzzy
logic. The built algorithm ensures that the tracking and approximation errors
are ultimately uniformly bounded (UUB). Finally, the effectiveness of the constructed algorithm is demonstrated through simulation and experimental results.
Simulation and experimental results show that the proposed controller is effective with small synchronous tracking errors, and the chattering phenomenon is
significantly reduced.
Remote field-programmable gate array laboratory for signal acquisition and de...IJECEIAES
A remote laboratory utilizing field-programmable gate array (FPGA) technologies enhances students’ learning experience anywhere and anytime in embedded system design. Existing remote laboratories prioritize hardware access and visual feedback for observing board behavior after programming, neglecting comprehensive debugging tools to resolve errors that require internal signal acquisition. This paper proposes a novel remote embeddedsystem design approach targeting FPGA technologies that are fully interactive via a web-based platform. Our solution provides FPGA board access and debugging capabilities beyond the visual feedback provided by existing remote laboratories. We implemented a lab module that allows users to seamlessly incorporate into their FPGA design. The module minimizes hardware resource utilization while enabling the acquisition of a large number of data samples from the signal during the experiments by adaptively compressing the signal prior to data transmission. The results demonstrate an average compression ratio of 2.90 across three benchmark signals, indicating efficient signal acquisition and effective debugging and analysis. This method allows users to acquire more data samples than conventional methods. The proposed lab allows students to remotely test and debug their designs, bridging the gap between theory and practice in embedded system design.
Detecting and resolving feature envy through automated machine learning and m...IJECEIAES
Efficiently identifying and resolving code smells enhances software project quality. This paper presents a novel solution, utilizing automated machine learning (AutoML) techniques, to detect code smells and apply move method refactoring. By evaluating code metrics before and after refactoring, we assessed its impact on coupling, complexity, and cohesion. Key contributions of this research include a unique dataset for code smell classification and the development of models using AutoGluon for optimal performance. Furthermore, the study identifies the top 20 influential features in classifying feature envy, a well-known code smell, stemming from excessive reliance on external classes. We also explored how move method refactoring addresses feature envy, revealing reduced coupling and complexity, and improved cohesion, ultimately enhancing code quality. In summary, this research offers an empirical, data-driven approach, integrating AutoML and move method refactoring to optimize software project quality. Insights gained shed light on the benefits of refactoring on code quality and the significance of specific features in detecting feature envy. Future research can expand to explore additional refactoring techniques and a broader range of code metrics, advancing software engineering practices and standards.
Smart monitoring technique for solar cell systems using internet of things ba...IJECEIAES
Rapidly and remotely monitoring and receiving the solar cell systems status parameters, solar irradiance, temperature, and humidity, are critical issues in enhancement their efficiency. Hence, in the present article an improved smart prototype of internet of things (IoT) technique based on embedded system through NodeMCU ESP8266 (ESP-12E) was carried out experimentally. Three different regions at Egypt; Luxor, Cairo, and El-Beheira cities were chosen to study their solar irradiance profile, temperature, and humidity by the proposed IoT system. The monitoring data of solar irradiance, temperature, and humidity were live visualized directly by Ubidots through hypertext transfer protocol (HTTP) protocol. The measured solar power radiation in Luxor, Cairo, and El-Beheira ranged between 216-1000, 245-958, and 187-692 W/m 2 respectively during the solar day. The accuracy and rapidity of obtaining monitoring results using the proposed IoT system made it a strong candidate for application in monitoring solar cell systems. On the other hand, the obtained solar power radiation results of the three considered regions strongly candidate Luxor and Cairo as suitable places to build up a solar cells system station rather than El-Beheira.
An efficient security framework for intrusion detection and prevention in int...IJECEIAES
Over the past few years, the internet of things (IoT) has advanced to connect billions of smart devices to improve quality of life. However, anomalies or malicious intrusions pose several security loopholes, leading to performance degradation and threat to data security in IoT operations. Thereby, IoT security systems must keep an eye on and restrict unwanted events from occurring in the IoT network. Recently, various technical solutions based on machine learning (ML) models have been derived towards identifying and restricting unwanted events in IoT. However, most ML-based approaches are prone to miss-classification due to inappropriate feature selection. Additionally, most ML approaches applied to intrusion detection and prevention consider supervised learning, which requires a large amount of labeled data to be trained. Consequently, such complex datasets are impossible to source in a large network like IoT. To address this problem, this proposed study introduces an efficient learning mechanism to strengthen the IoT security aspects. The proposed algorithm incorporates supervised and unsupervised approaches to improve the learning models for intrusion detection and mitigation. Compared with the related works, the experimental outcome shows that the model performs well in a benchmark dataset. It accomplishes an improved detection accuracy of approximately 99.21%.
Use PyCharm for remote debugging of WSL on a Windo cf5c162d672e4e58b4dde5d797...shadow0702a
This document serves as a comprehensive step-by-step guide on how to effectively use PyCharm for remote debugging of the Windows Subsystem for Linux (WSL) on a local Windows machine. It meticulously outlines several critical steps in the process, starting with the crucial task of enabling permissions, followed by the installation and configuration of WSL.
The guide then proceeds to explain how to set up the SSH service within the WSL environment, an integral part of the process. Alongside this, it also provides detailed instructions on how to modify the inbound rules of the Windows firewall to facilitate the process, ensuring that there are no connectivity issues that could potentially hinder the debugging process.
The document further emphasizes on the importance of checking the connection between the Windows and WSL environments, providing instructions on how to ensure that the connection is optimal and ready for remote debugging.
It also offers an in-depth guide on how to configure the WSL interpreter and files within the PyCharm environment. This is essential for ensuring that the debugging process is set up correctly and that the program can be run effectively within the WSL terminal.
Additionally, the document provides guidance on how to set up breakpoints for debugging, a fundamental aspect of the debugging process which allows the developer to stop the execution of their code at certain points and inspect their program at those stages.
Finally, the document concludes by providing a link to a reference blog. This blog offers additional information and guidance on configuring the remote Python interpreter in PyCharm, providing the reader with a well-rounded understanding of the process.
Software Engineering and Project Management - Software Testing + Agile Method...Prakhyath Rai
Software Testing: A Strategic Approach to Software Testing, Strategic Issues, Test Strategies for Conventional Software, Test Strategies for Object -Oriented Software, Validation Testing, System Testing, The Art of Debugging.
Agile Methodology: Before Agile – Waterfall, Agile Development.
Prediction of Electrical Energy Efficiency Using Information on Consumer's Ac...PriyankaKilaniya
Energy efficiency has been important since the latter part of the last century. The main object of this survey is to determine the energy efficiency knowledge among consumers. Two separate districts in Bangladesh are selected to conduct the survey on households and showrooms about the energy and seller also. The survey uses the data to find some regression equations from which it is easy to predict energy efficiency knowledge. The data is analyzed and calculated based on five important criteria. The initial target was to find some factors that help predict a person's energy efficiency knowledge. From the survey, it is found that the energy efficiency awareness among the people of our country is very low. Relationships between household energy use behaviors are estimated using a unique dataset of about 40 households and 20 showrooms in Bangladesh's Chapainawabganj and Bagerhat districts. Knowledge of energy consumption and energy efficiency technology options is found to be associated with household use of energy conservation practices. Household characteristics also influence household energy use behavior. Younger household cohorts are more likely to adopt energy-efficient technologies and energy conservation practices and place primary importance on energy saving for environmental reasons. Education also influences attitudes toward energy conservation in Bangladesh. Low-education households indicate they primarily save electricity for the environment while high-education households indicate they are motivated by environmental concerns.
Null Bangalore | Pentesters Approach to AWS IAMDivyanshu
#Abstract:
- Learn more about the real-world methods for auditing AWS IAM (Identity and Access Management) as a pentester. So let us proceed with a brief discussion of IAM as well as some typical misconfigurations and their potential exploits in order to reinforce the understanding of IAM security best practices.
- Gain actionable insights into AWS IAM policies and roles, using hands on approach.
#Prerequisites:
- Basic understanding of AWS services and architecture
- Familiarity with cloud security concepts
- Experience using the AWS Management Console or AWS CLI.
- For hands on lab create account on [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
# Scenario Covered:
- Basics of IAM in AWS
- Implementing IAM Policies with Least Privilege to Manage S3 Bucket
- Objective: Create an S3 bucket with least privilege IAM policy and validate access.
- Steps:
- Create S3 bucket.
- Attach least privilege policy to IAM user.
- Validate access.
- Exploiting IAM PassRole Misconfiguration
-Allows a user to pass a specific IAM role to an AWS service (ec2), typically used for service access delegation. Then exploit PassRole Misconfiguration granting unauthorized access to sensitive resources.
- Objective: Demonstrate how a PassRole misconfiguration can grant unauthorized access.
- Steps:
- Allow user to pass IAM role to EC2.
- Exploit misconfiguration for unauthorized access.
- Access sensitive resources.
- Exploiting IAM AssumeRole Misconfiguration with Overly Permissive Role
- An overly permissive IAM role configuration can lead to privilege escalation by creating a role with administrative privileges and allow a user to assume this role.
- Objective: Show how overly permissive IAM roles can lead to privilege escalation.
- Steps:
- Create role with administrative privileges.
- Allow user to assume the role.
- Perform administrative actions.
- Differentiation between PassRole vs AssumeRole
Try at [killercoda.com](https://killercoda.com/cloudsecurity-scenario/)
Discover the latest insights on Data Driven Maintenance with our comprehensive webinar presentation. Learn about traditional maintenance challenges, the right approach to utilizing data, and the benefits of adopting a Data Driven Maintenance strategy. Explore real-world examples, industry best practices, and innovative solutions like FMECA and the D3M model. This presentation, led by expert Jules Oudmans, is essential for asset owners looking to optimize their maintenance processes and leverage digital technologies for improved efficiency and performance. Download now to stay ahead in the evolving maintenance landscape.
Digital Twins Computer Networking Paper Presentation.pptxaryanpankaj78
A Digital Twin in computer networking is a virtual representation of a physical network, used to simulate, analyze, and optimize network performance and reliability. It leverages real-time data to enhance network management, predict issues, and improve decision-making processes.
Role of artificial intelligence in cloud computing, IoT and SDN: Reliability and scalability issues

International Journal of Electrical and Computer Engineering (IJECE)
Vol. 11, No. 5, October 2021, pp. 4458~4470
ISSN: 2088-8708, DOI: 10.11591/ijece.v11i5.pp4458-4470
Journal homepage: http://ijece.iaescore.com

Mohammad Riyaz Belgaum (1), Zainab Alansari (2), Shahrulniza Musa (3), Muhammad Mansoor Alam (4), M. S. Mazliham (5)
(1, 3, 4) Malaysian Institute of Information Technology, Universiti Kuala Lumpur, Malaysia
(2) Faculty of Information Technology, Majan University College, Oman
(2) Faculty of Computer Science & Information Technology, University of Malaya, Malaysia
(4) Institute of Business Management, Pakistan
(5) Malaysia France Institute, Universiti Kuala Lumpur, Malaysia
Article history: Received Aug 9, 2020; Revised Dec 16, 2020; Accepted Jan 13, 2021

ABSTRACT
Information technology fields are now dominated by artificial intelligence, which plays a key role in providing better services. The inherent strengths of artificial intelligence are driving companies into a modern, decisive, secure, and insight-driven arena to address current and future challenges. Key technologies like cloud computing, the internet of things (IoT), and software-defined networking (SDN) are emerging as future applications and rendering benefits to society. Integrating artificial intelligence with these innovations at scale takes their beneficiaries to the next level of efficiency. Data generated from heterogeneous devices are received, exchanged, stored, managed, and analyzed to automate and improve the performance of the overall system and make it more reliable. These new technologies are not free of limitations, and their synthesis has put forth many challenges in terms of scalability and reliability. Therefore, this paper discusses the role of artificial intelligence (AI), along with the issues and opportunities that all communities confront when integrating these technologies, in terms of reliability and scalability. It also puts forward future directions related to the scalability and reliability concerns that arise during this integration, enabling researchers to address the current research gaps.
Keywords: Artificial intelligence; Cloud computing; IoT; Reliability; Scalability; Software-defined networking

This is an open access article under the CC BY-SA license.
Corresponding Author:
Shahrulniza Musa
Malaysian Institute of Information Technology
Universiti Kuala Lumpur
1016 Jalan Sultan Ismail, 50250 Kuala Lumpur, Wilayah Persekutuan Kuala Lumpur, Malaysia
Email: shahrulniza@unikl.edu.my
1. INTRODUCTION
Artificial intelligence (AI) is playing a key role in changing the paradigm of every noteworthy area of computing and innovation. AI works to create systems that can learn to mimic human behaviour from prior experience and without manual intervention. AI, once said to be rigid and a language reserved for extreme computer intellectuals, has now become flexible and is being used in many areas [1]. AI has enormous potential, and what is being utilized at present is far less than what it offers. AI enables various advancements by gathering historical data, performing a thorough analysis of it, identifying patterns, and guiding real-time decisions. With the assistance
of AI, the consumer can automate processes and increase knowledge while removing the possibility of mistakes that arise when performing manual operations.
Industry 4.0 [2] supports the creation of a "smart factory" through process automation. The information technology field is commanded by a few different innovations, such as cloud computing, the internet of things (IoT), and software-defined networking (SDN), with their enormous qualities in an interrelated space. Various technologies support Industry 4.0, as shown in Figure 1, each contributing to higher levels of productivity. AI combined with these technologies is at a revolutionary catalyst stage, where much attention is required. Artificial intelligence leads to cognitive intelligence through knowledge, understanding, and decision-making from past experiences, and it is advancing at a high pace toward technology that enables computers and machines to work smartly and enable the smart industry. Cloud services with IoT devices that can be programmed following the wishes of each user are now on the brink of satisfying users with their performance. The heterogeneity [3] of the devices should allow them to communicate with each other in an interoperable data format, which leads to scalability and reliability with better user satisfaction. Reliability in this context is the degree of trustworthiness with which joint services are performed consistently, and it is an indicator of the system's percentage susceptibility to failure. Scalability can be defined as the capacity of the system to match increasing requirements comprehensively and to accommodate growth; here, it is taken as the limit on the number of connected devices within a scenario while utilizing resources efficiently. Scalability is the desired property of a service that enables the management of rising demand loads without substantial degradation of the related quality attributes [4].
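These two definitions can be made concrete with a small sketch. The formulas below are illustrative assumptions of ours, not taken from the paper: reliability is measured as the fraction of joint-service invocations that succeed, and scalability is checked by asking whether latency under the heaviest load stays within a tolerance of the latency under the lightest load.

```python
def reliability(successful_requests, total_requests):
    # Fraction of joint-service invocations that complete successfully;
    # 1 - reliability is the system's susceptibility to failure.
    if total_requests == 0:
        return 1.0
    return successful_requests / total_requests


def is_scalable(latency_by_load, tolerance=1.5):
    # A service is deemed scalable if latency under the heaviest load stays
    # within `tolerance` times the latency under the lightest load.
    loads = sorted(latency_by_load)
    return latency_by_load[loads[-1]] <= tolerance * latency_by_load[loads[0]]


# 9,850 of 10,000 joint-service calls succeeded -> reliability 0.985
print(reliability(9850, 10000))
# Latency grows from 120 ms at 100 devices to 150 ms at 10,000 devices
print(is_scalable({100: 120, 1000: 130, 10000: 150}))
```

Under these toy definitions, a system that keeps latency growth bounded as connected devices increase a hundredfold would count as scalable; the threshold itself is a design choice.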
Figure 1. Technologies supporting Industry 4.0
The terms reliability and scalability are sometimes interdependent: the more devices are connected and competing for resources, the lower the reliability. On the other hand, if fewer devices are connected, most of the resources are left unused. So, for proper utilization of the resources, AI should play a significant role by predicting the required usage of resources in advance. Various researchers are working on integrating AI with cloud, AI with IoT, and AI with SDN. But the integration of these emerging technologies together with AI has still not been explored, and its performance can be assessed using scalability and reliability attributes.
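As a minimal sketch of this idea of predicting resource usage in advance (the class name, sliding-window forecast, and headroom factor are our own illustrative assumptions, not a method from the paper), a scheduler might forecast the next interval's demand from recent observations and provision slightly above it:

```python
from collections import deque


class DemandPredictor:
    # Toy predictor: forecasts the next interval's resource demand as the
    # mean of a sliding window of recent observations, plus 25% headroom.
    def __init__(self, window=4, headroom=1.25):
        self.history = deque(maxlen=window)  # keeps only the last `window` samples
        self.headroom = headroom

    def observe(self, demand):
        self.history.append(demand)

    def provision(self):
        # Amount of resource to provision ahead of the next interval.
        if not self.history:
            return 0.0
        forecast = sum(self.history) / len(self.history)
        return forecast * self.headroom


predictor = DemandPredictor()
for demand in [40, 50, 60, 70]:   # observed CPU units per interval
    predictor.observe(demand)
print(predictor.provision())      # mean demand 55 plus 25% headroom -> 68.75
```

A real AI-assisted scheduler would replace the moving average with a learned model, but the shape is the same: observe, forecast, and provision before demand arrives rather than after resources starve.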
Across several diverse fields of technology [5], AI has powered recent developments, and cloud
computing, IoT and SDN are no exception. Though various other metrics are used to assess these technologies,
reliability and scalability are the most significant. Although various researchers have discussed the
reliability and scalability issues in cloud computing, IoT and SDN separately, none have discussed the issues
in the integration of these technologies. Moreover, this paper presents the reliability and scalability
testing attributes which can be used in common to assess the performance of cloud computing, IoT and SDN.
The role of AI in addressing some of the challenges of each of the above-mentioned technologies is discussed,
together with the future directions which can be undertaken in the process of integrating AI with the three
technologies mentioned.

Int J Elec & Comp Eng, ISSN: 2088-8708, Vol. 11, No. 5, October 2021: 4458 - 4470
The remainder of the paper is organized as follows. Section 2 discusses the related literature.
Section 3 focuses on the reliability and scalability testing attributes. Section 4 discusses reliability and
scalability in artificial intelligence-enabled cloud computing, IoT, and SDN. Section 5 then puts forward the
research challenges and opportunities related to scalability and reliability during the fusion of the
technologies mentioned above. Finally, section 6 presents the conclusion.
2. RELATED LITERATURE
Different scholars have researched and analyzed the convergence of these technologies, and we
present some relevant research in this segment. In Reddy [6], the authors discussed some of the challenges
in the integration of IoT with wireless sensor networks, covering challenges related to security, software,
and hardware; however, the attributes used to assess such integration were not mentioned. The transformation
from conventional to smart industry is being pursued in many countries with the greatest attention. In the
survey by AlEnezi [7], the authors demonstrate that mindscaping is Kuwait's primary challenge, investment is
India's biggest challenge, and for the United States, security and privacy are the major problems in
implementing a smart government. In a stable distributed infrastructure, low-power IoT devices should use
computing resources efficiently and only when needed. So, in Sharma [8], the authors proposed controller fog
nodes to respond to challenges such as usability, real-time data transmission, scalability, stability,
durability, and low latency at the edge of scalable IoT networking, using a blockchain-based distributed
cloud architecture with software-defined networking (SDN). In dense wireless networks, active
device-to-device communication relies on cluster formation. The services provided by cloud service providers
can be assessed using quality of service (QoS) parameters. Gill and Buyya [9] discussed various open issues
to be addressed, such as failure management, workload management, accountability, security, and physical
infrastructure. Also, a conceptual model was proposed to match the dynamic requirements of the user with
reliable services. Communication among IoT devices can be done using SDN, and Hohlfeld [10] have put forth
the challenges in the data plane, control plane, and application plane, focusing on scalability. The
challenges listed are traffic monitoring, plane interfaces, and security, including trust management and
threat analysis.
Sharafeddine and Farhat [11] proposed an approach to address reliable communication issues while
considering the scalability of devices. A heuristic approach was used to select the cluster head and optimize
network reliability, resulting in lower failure cost. In the field of electrical grids, the evolution of
smart grid technology has enabled reliable power distribution. However, it is not free of attacks, and the
anomaly detection technique proposed in [12] differentiates between a fault or disturbance and a cyber-attack.
Symbolic dynamic filters (SDF) were used for computationally efficient feature extraction to identify causal
correlations between the sub-systems of smart grids through dynamic Bayesian networks (DBN). In high-speed
networks where attacks are increasing, the authors' claim that machine learning solutions are insufficient
for reliable intrusion detection is supported by the proposed approach BigFlow [13]. The proposed approach
uses a verification model to prove its reliability by considering scalable network traffic with high accuracy
over a period.
Rao et al. [14] discussed a homomorphic encryption technique to provide enhanced security during
communication between the cloud and the consumer. In a smart weather system, sensors are used to collect
different types of values like humidity, temperature, raindrops, and CO2 emission. The values collected are
stored in the cloud, avoiding storage in large local databases, and encryption is used to ensure the privacy
of the stored data. The stored data is accessed by various types of devices, which increases scalability.
This data could be further analysed using artificial intelligence techniques for weather forecasting.
Plantevin et al. [15] presented an architecture for smart homes that adds intelligence to the transducer used
for communication. The designed architecture collects raw data from the sensor and uses a smart unit for
communication with high reliability. The proposed architecture eliminated the single point of failure and
guaranteed scalability and reliability at a low cost. The attributes used to assess reliability and
scalability vary in each technology, though some attributes are commonly used.
3. RELIABILITY AND SCALABILITY ATTRIBUTES
The attributes used to assess reliability and scalability are closely related and depend on each
other. Figure 2 shows the list of attributes used to test reliability and scalability. The reliability of a
system refers to the capacity to consistently fulfil customer service requirements under the conditions of
coexisting mitigating factors and increasing market volume [16]. The reliability of a system is a diverse
device assessment issue in a multi-factor environment that has universality, heterogeneity, and uncertainty
characteristics. The attributes used to test the reliability are explained in Table 1.

Role of artificial intelligence in cloud computing, IoT and SDN: Reliability… (Mohammad Riyaz Belgaum)
Figure 2. Reliability and scalability testing attributes
Table 1. Reliability testing attributes
Attribute Explanation
Data
Consistency
The extent to which all the data items on a scale measure one construct making the data consistent on all
the devices. Consistency in data leads to better reliability.
Stability The results obtained using an instrument should be consistent on repeated testing to ensure stability.
And a highly stable system ensures reliability.
Equivalence Coherence between the responses of different device consumers or between the alternative type of
devices proves equivalence. Equivalence increases reliability.
Interoperability Through sharing data in a common standard format, the chosen device software will communicate with
on-site applications to allow the devices to interoperate.
Security Data security must be evaluated against the guidelines of organizations and frameworks like NIST,
COBIT and ISACA. Security is the most significant attribute, as any security breach may result in
catastrophic incidents.
Usability The usability term applies to the degree to which a particular form of the customer may utilize a device,
service or program to accomplish the final target with performance, efficiency and reliability while
interacting with various categories of computing devices.
Availability The degree to which the system remains operational and accessible to its consumers whenever
required. Higher availability directly improves reliability.
Concern for scalability is high in a centralized environment. Scalability is the ability to expand
the computing service distribution capability by raising the amount of information delivered; it is also
described as the provisioning and de-provisioning of devices [17]. It describes the load on the network as
services are allocated. For scalable systems, the load curve rises linearly as resources are added, while for
systems that do not scale well, this curve flattens out. A segment between any two units that achieves
maximum capacity with minimum deployment measures scalability. The attributes related to testing scalability
are discussed in Table 2.
4. ROLE OF ARTIFICIAL INTELLIGENCE IN CLOUD COMPUTING, IOT AND SDN
The emerging technologies discussed in this paper have various challenges to be addressed, and AI plays an
important role in addressing them. The related literature is used to summarize some of the challenges
considered in each technology, as shown in Figure 3.
4.1. Role of AI in cloud computing
The incorporation of cloud computing into various online activities is already in progress [18]. The
amalgamation of AI with cloud computing is significantly changing the way information technology (IT) is
used across businesses and other industries through recent transformations. The various cloud computing
services [19] play an essential role in the implementation of cloud-based solutions. Automated assistants
such as Apple's Siri and Amazon's Alexa are examples that have changed our daily lives. The integration
allows the cloud to become self-managing through artificial intelligence. In IT systems, for example, AI
determines, after thorough analysis, how to perform repetitive tasks using knowledge discovery. AI's ability
to learn from past experiences allows the cloud to rebound from problems. The reliability of cloud services
is increased as AI enhances data processing by efficiently storing and manipulating vast volumes of data. It
enables data collection to be automated to provide customers with accurate information and to trigger an
alarm if unusual behaviour is detected. Intelligent analysis based on deep neural network modelling can
provide enterprises with much better control over their data and significant real-time impacts. AI increases
consumer interest in software-as-a-service (SaaS). Salesforce launched an AI tool called Einstein [20], which
translates data into insights. The tool uses customer data to drive sales through advertisement and provides
advice to customers through various social networking means.
Table 2. Scalability testing attributes
Attribute Explanation
Resource
management
Resources are very crucial and should be used efficiently and effectively. The increase in resources
increases the capacity of handling the load but the deallocation of resources should not affect the
performance of the system much. So, effective resource management supports the system in terms of
scalability.
Response time The time from when a request is made to the time the request starts being processed is the
response time. When a greater number of devices start using the same resource, the response time
increases, as new requests must wait in the queue.
Throughput The number of requests processed in a unit of time is called throughput. When multiple devices share a
common resource then the throughput decreases as the resource time has to be shared among all the devices.
Latency Latency is the time delay taken to respond to the request. Latency can be due to setup time or
communication time too.
CPU usage CPU is a very scarce resource and should be used very efficiently. When the scalability increases the CPU
usage also increases as a greater number of devices tend to use it. At the same time, the devices connected
should not fall below or exceed the threshold to have better CPU usage.
Memory usage Increase in scalability occupies more memory. The devices information and the data from the devices are
stored in the memory. High scalability results in more memory usage.
Network usage The devices are connected in a network and communicate with each other. The network usage is determined
by the link utilization between the devices. Increase in scalability increases network usage.
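The timing attributes in Table 2 can be computed from a request log; a small sketch (the log layout of (arrival, start, finish) timestamps is our own assumption):

```python
def scalability_metrics(requests):
    """Compute Table 2 style metrics from a list of
    (arrival, start, finish) timestamps in seconds. Illustrative only.
    Response time: arrival to start of processing (as defined in Table 2).
    Latency: arrival to completion. Throughput: requests per unit time."""
    response_times = [start - arrival for arrival, start, _ in requests]
    latencies = [finish - arrival for arrival, _, finish in requests]
    span = max(f for _, _, f in requests) - min(a for a, _, _ in requests)
    return {
        "avg_response_time": sum(response_times) / len(response_times),
        "avg_latency": sum(latencies) / len(latencies),
        "throughput": len(requests) / span,
    }

# Three requests over a 10-second span:
log = [(0.0, 0.5, 2.0), (1.0, 2.0, 4.0), (5.0, 5.5, 10.0)]
m = scalability_metrics(log)
```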
Figure 3. Role of AI in addressing challenges
Besides the services provided by the cloud to consumers, the use of artificial intelligence as an
artificial intelligence-as-a-service (AIaaS) [21] is also available to users. Various fields of AI, like
neural networks and machine learning, gather vast data sets to construct, train, process, and execute models
effectively. The analysis, calculation, and statistics of workload distributed over the cloud servers are,
therefore, more readily accessible. Cloud resources are used intelligently and can be dynamically scaled up
and down as needed. Manually processing huge volumes of data contributes to consumer loss over
time. Artificial intelligence strongly supports the fault tolerance [22] mechanism and tracks and handles
application malfunctions through the smooth relocation of servers. Supervised and unsupervised deep learning
approaches are used in tandem with computational methods to recognize and minimize loss trends.
The integration of AI into cloud computing to make it accessible to everyone can be done by
anchoring the cloud infrastructure with AI applications through continuous integration, continuous delivery,
and phased deployments. A framework called ModelOps was proposed by the authors in [23], with reliable and
trusted AI for cloud-based lifecycle management to reduce and manage deployment risks; however, the
limitation of resource management under large-scale use cases was left unaddressed. Software architects are
striving to develop a reliable and scalable cloud-assisted smart factory, which relates various AI
technologies and their logical relationships. AI is powering the cloud to emerge as a self-managing cloud.
Many researchers have speculated that as AI grows more advanced, private and public clouds may rely on AI
resources to track, control, and even self-heal when issues arise. Some of the major challenges in cloud
computing to ensure reliability and support scalability, along with the role of AI in providing solutions to
them, are listed in Table 3.
Table 3. Role of AI in addressing the challenges in cloud computing
Major challenges in
cloud computing
Role of AI in providing a solution
Infrastructure
optimization
Neural networks and fuzzy systems can be used for classification of operating status.
Machine learning can be used for extracting the data from previous instances.
Evolutionary computing can be used for multi-objective optimization.
Fault management
and Resilience
Neural networks, fuzzy systems and evolutionary computing can be used for mapping the tasks
to the virtual machines.
Neural networks, fuzzy systems and machine learning can be used for prediction of the faults.
Cloud service
pricing
Neural networks, fuzzy systems and machine learning can be used for predicting market price
trend and user demand.
Neural networks can be used for mapping cost and demand for a price.
Evolutionary computing can be used for multi-objective optimization.
Load balancing Neural networks can be used for detecting load imbalance and for mapping virtual machines
and dependability issues.
Fuzzy systems can be used for specifying preferences, constraints, and management guidelines.
Evolutionary computing can be used for multi-objective optimization.
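As an illustration of the load-imbalance detection row above, a simple statistical stand-in for a neural-network detector can be sketched (the coefficient-of-variation rule and its threshold are our own illustrative assumptions, not the paper's method):

```python
def load_imbalance(loads, threshold=0.25):
    """Flag load imbalance across virtual machines using the
    coefficient of variation (std/mean) of their current loads.
    A trained neural network could replace this hand-written rule;
    the 0.25 threshold is an arbitrary assumption."""
    mean = sum(loads) / len(loads)
    if mean == 0:
        return False  # no load anywhere, nothing to rebalance
    var = sum((x - mean) ** 2 for x in loads) / len(loads)
    cv = var ** 0.5 / mean
    return cv > threshold

balanced = [50, 52, 48, 50]   # near-uniform VM loads
skewed = [90, 10, 10, 90]     # two VMs heavily overloaded
```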
4.2. Role of AI in the internet of things
The internet of things [24] is a network of various objects linked through the internet that capture
and share data with one another without human intervention. The use of the internet to define, sense, link,
and compute personal, technical, and social artefacts has made life better but also more complex. All IoT
appliances produce a lot of data to obtain and use for operational performance. Artificial intelligence
enters the picture here to capture and process the vast volume of data generated, through artificial
intelligence algorithms. Besides, these algorithms transform the data into useful results that IoT devices
can act upon. These applications have become almost intelligent through the integration of artificial
intelligence.
Figure 4 shows process automation in areas such as the industrial internet of things (IIoT) and
the consumer internet of things (CIoT), where the introduction of artificial intelligence into IoT
applications is relevant. For example, in Alansari et al. [25], IoT plays a key role in the delivery of
patient information to a doctor for sustainable development in the healthcare sector. The usage of IoT
technologies has also extended into developing smart cities, smart homes, and, in the broader context, smart
buildings. With the primary concerns being the automation of smart home device operation using artificial
intelligence and the optimization of energy consumption [26], many researchers have taken this as a challenge
to gain users' satisfaction [27].
The growing desire for comfort in the community leads to more information being accessed and shared
among IoT devices, which in turn consumes more electricity and generates heat. Though there are many
techniques to control household energy consumption, they are still traditional. A framework has been proposed
in [28] to predict household energy consumption using a feed-forward back-propagation neural network to
automate the process and ensure reliability. The goal of IoT is to provide services under the limitations of
low-power long-range (LoRa) communication while ensuring reliability. The authors in Reynders et al. [29]
focused on LoRa wide-area networks (LoRaWAN) and proposed a two-step lightweight scheduling scheme to improve
reliability and scalability. Each channel's transmission power and spreading factor are evaluated
dynamically, and the node decides which channel must be used for transmission. To ensure scalability, the
attribute used was throughput while adding many nodes; to ensure reliability, the attributes used were packet
error ratio and fairness.
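Both reliability attributes named here have standard formulations; a small sketch, assuming "fairness" refers to the widely used Jain's fairness index (the paper does not specify which fairness measure [29] uses):

```python
def packet_error_ratio(sent: int, received: int) -> float:
    """PER: fraction of transmitted packets that were lost or corrupted."""
    return (sent - received) / sent

def jain_fairness(throughputs):
    """Jain's fairness index over per-node throughputs: 1.0 means all
    nodes receive equal throughput; 1/n means one node gets everything."""
    n = len(throughputs)
    total = sum(throughputs)
    return total * total / (n * sum(x * x for x in throughputs))

# A node that received 95 of 100 packets has PER 0.05; two nodes with
# equal throughput have a fairness index of 1.0.
```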
Under IoT, protection is a big issue because most heterogeneous networks are interrelated and
interact with each other as they scale. There is still room for bugs and attacks to infiltrate the network
and disrupt critical operations at this stage. Big data is complemented with related network and social data,
supplemented by different modes focused on physical-cyber-social (PCS) media. Artificial intelligence enables
users to counter a range of attack styles, as described in [30]. Some of the major challenges in IoT to
ensure reliability and support scalability, along with the role of AI in providing solutions to them, are
listed in Table 4.
Figure 4. Consumer and Industrial IoT
Table 4. Role of AI in addressing the challenges in IoT
Major challenges in IoT Role of AI in providing a solution
Energy management Neural networks and fuzzy systems can be used for energy consumption prediction.
Neural networks can be used for classifying energy usage.
Machine learning can be used for extracting the data from previous instances.
Security Neural networks, fuzzy systems and machine learning can be used for classification and clustering
of security patterns in intrusion detection and malware detection.
Interoperability Fuzzy systems can be used for defining the matching preferences by specifying the constraints.
Machine learning can be used for extracting knowledge from previous instances.
Neural networks can be used for mapping the tasks.
Neural networks and fuzzy systems can be used for prediction of workload.
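To illustrate the energy-consumption prediction row above, a least-squares trend fit can stand in for the neural-network predictors (purely illustrative; this is not the model from [28], and the readings are invented):

```python
def fit_line(xs, ys):
    """Ordinary least-squares line fit, a minimal stand-in for the
    energy-consumption predictors mentioned in Table 4.
    Returns (intercept, slope) for y = a + b * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical hourly household energy readings (kWh) growing linearly:
hours = [1, 2, 3, 4]
kwh = [2.0, 4.0, 6.0, 8.0]
a, b = fit_line(hours, kwh)
next_kwh = a + b * 5  # predicted usage for hour 5
```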
4.3. Role of AI in Software-defined networking
Software-defined networking has become a breakthrough in taking network architectures to the
next generation. It enables fine-grained flow management, which can configure networks and render them more
robust [31]. The decoupling of the control plane and data plane results in a centralized view of the network
with increased flexibility. The use of SDN in different areas like corporate and office networks, data centre
networks, broad-based networks, and internet sharing centres has made it possible to transition from
conventional networking to the SDN architecture. This transition resulted in scalability issues because of
the potential explosion of flow data in SDN switches. The minimal flow table capacity of the SDN switch
device significantly reduces the viability of large-scale SDN deployment. The logical movement of the control
planes from the switches to a centralized location gives control over the entire network and has simplified
the functionality, but it has also raised issues on the scalability of the control plane as a result of
virtualization.
The isolation of the control plane and data plane from the switch, and the concept of programming
the network itself, allow networking components produced by different vendors to be interoperable. The
softwarization of networking in line with the three layers of SDN has become complicated in terms of
reliability due to the multi-lateral SDN domains. With artificial intelligence, SDN problems like
scalability, efficiency, consistency, interoperability, control, and protection can be overcome, making the
networking smart. There is little capacity in the ternary content-addressable memory (TCAM), and the
controller changes the rules [32]. A rational device can adapt to potential inputs based upon past
experiences. The gated recurrent unit recurrent neural network (GRU-RNN) [33] is a supervised learning model
used to solve security concerns by detecting intrusions, enabling the system to be more reliable. The error
gradient is determined, and the activity of the components in the network is tracked to a minimum. SDN
reliability depends on data plane reliability, control path reliability, and control plane reliability.
Various authors have proposed solutions for reliability issues. Among them, in [34], the authors proposed a
framework to address the issues of reliability and scalability with a case study. The framework included the
reduction of redundancy in the coupling of the logical and control paths, along with enabling virtualization
in the controller cluster, quick recovery from failures, and orchestration of control messages to improve
reliability and scalability. Some of the major challenges in SDN to ensure reliability and support
scalability, along with the role of AI in providing solutions to them, are listed in Table 5.
Table 5. Role of AI in addressing the challenges in SDN
Major challenges in SDN Role of AI in providing a solution
Monitoring Fuzzy systems can be used for specifying behaviour patterns and constraints.
Neural networks, Fuzzy systems and machine learning can be used for detecting behaviour
anomalies.
Threat analysis Neural networks, fuzzy systems and machine learning can be used for threat detection and
analysis.
Trust management Neural networks, fuzzy systems and machine learning can be used for trust classification
and behaviour.
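A minimal stand-in for the behaviour-anomaly detection row, flagging traffic counts that deviate strongly from the historical mean (the learned detectors in Table 5 would replace this hand-written rule; the 3-sigma threshold is an assumption):

```python
def is_anomalous(history, value, k=3.0):
    """Flag a traffic measurement as anomalous when it lies more than
    k standard deviations from the historical mean. A rough stand-in
    for the neural/fuzzy/ML anomaly detectors mentioned in Table 5."""
    n = len(history)
    mean = sum(history) / n
    std = (sum((x - mean) ** 2 for x in history) / n) ** 0.5
    if std == 0:
        return value != mean  # constant history: any change is anomalous
    return abs(value - mean) > k * std

# Packets-per-second samples from normal operation:
normal_traffic = [100, 102, 98, 101, 99]
```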
4.4. Role of AI in the fusion
The emerging technologies cloud, IoT, and SDN have become game-changers in providing technical
services to end-users. The fusion of AI and its synchronization with cloud computing, IoT, and SDN is about
to bring a significant change to the world of information technology, going beyond imagination by taking it
to the next level. SDN has appeared as a powerful platform to improve the scalability and versatility of IoT
devices via the cloud. Progress in the field of communication, like the emergence of 5G, has solved many
issues of communications networking. Data collected from several IoT devices are gathered and accessible in a
variety of formats and configurations, creating massive data to be processed in the cloud and accessed via
vendor-free networking devices. The IoT devices can exchange data with SDNs and cloud resources even if the
network is congested or a network failure happens [35]. Artificial intelligence improves efficiency with its
potential. Figure 5 indicates that although results are enhanced when artificial intelligence is applied
separately to cloud, IoT, and SDN, artificial intelligence can be integrated even better with a range of
applications through the convergence of all of them.
The above sections show that artificial intelligence provides a better benefit in tandem with each
technology separately. The ability to fuse artificial intelligence with all three discussed technologies
would enable societies to simplify operations in all aspects of digital technology and to meet the needs of
industry 4.0. The artificial intelligence software-defined cloud of things (AISDCoT) allows users to
intelligently use communication devices and to securely transmit data, by implementing SDN standards, while
storing or accessing cloud-specified data.
The various streams of AI provide better solutions for various issues in each technology and have
proved their effectiveness in many aspects by supporting selection, allocation, mapping, and optimization.
Fuzzy systems can be used for defining matching preferences and constraints in the cloud and specifying them
to the connected devices via a network to attain a better solution. Fuzzy systems can also be used for the
classification and clustering of security patterns in the cloud, IoT devices, and SDN to ensure reliability.
After collecting the requirements from the users, neural networks can be used to map tasks to the virtual
machines in the cloud; if load imbalance occurs, the detection of load imbalance with consideration of
dependability issues can also be done using neural networks. The classification of energy usage by the
connected IoT devices can be done using fuzzy systems and neural networks to support scalability. The
interconnection of devices and the connection with the cloud are done through networking capability. SDN
enables communication with devices, keeping them vendor-free and programming the network itself with a
multi-objective function. Evolutionary computing can be used to attain multi-objective optimization in the
network, considering metrics like throughput, packet loss, delay, bandwidth utilization, and jitter,
contributing to improving the quality of service. Machine learning, a subset of AI, can be used for
extracting knowledge from previous situations and predicting future occurrences. Figure 6 shows the
interaction between the technologies and the role of AI in providing reliable services to the end-user using
scalable devices; it shows the feasibility of the interaction of these technologies.
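The Pareto-dominance test at the core of such evolutionary multi-objective optimization can be sketched as follows (the QoS vectors and values are hypothetical; delay is negated so that "larger is better" applies uniformly):

```python
def dominates(a, b):
    """True if QoS vector `a` Pareto-dominates `b`: `a` is at least as
    good in every objective and strictly better in at least one."""
    return all(x >= y for x, y in zip(a, b)) and \
           any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated QoS vectors: the trade-off set an
    evolutionary multi-objective optimizer would converge toward."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (throughput Mbps, -delay ms) candidates:
candidates = [(10, -5), (8, -3), (6, -10), (10, -10)]
front = pareto_front(candidates)
```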
Figure 5. Artificial intelligence software-defined cloud of things (AISDCoT) [36]
Figure 6. Reliability and scalability from the end-user’s perspective
5. FUTURE DIRECTIONS AND OPEN ISSUES
The goal of every technology is to provide highly reliable and scalable services and products to the
end-user, but occasional faults occur in any dynamic shared system. The following are some of the issues that
can be considered as future directions of research while improving the reliability and scalability during the
fusion of the above-mentioned technologies.
a. Security: The scalability of devices in the network may lead to an unreliable state if new devices are
added and removed without authentication. If the smart devices are modular, they are very vulnerable to
threats when their details are modified frequently. So, as a future direction, an artificial intelligence
technique can be developed to verify authenticity and prevent different types of attacks, improving this
integration.
b. Interaction: The IoT network interactions via SDN take place among the devices, while the processing is
done in the cloud. The end-user requirements can be met with the design of better architectures, protocols,
and applications. Therefore, as a future direction, efficient transmission is required, achieved by
proposing suitable artificial intelligence-capable techniques for designing protocols and applications for
interaction.
c. Communication management: The devices communicate using various implementations but are restricted to a
small network or a specific functional jurisdiction. Large-scale device communication can cover large
geographical areas, and such situations require very flexible monitoring and management systems. Therefore,
as a future direction, artificial intelligence-based traffic control and management are required to support
reliable communication among a vast number of devices.
d. Flow table update: SDN flow tables have limited memory and can accommodate only a limited number of flow
entries. Frequent changing of flow table entries results in extra overhead and makes the system unreliable.
So, as a future direction, there is a need for a mechanism where artificial intelligence can be used to
predict flow table entry insertion and removal, making the devices more scalable and reliable.
e. Interoperability: The quality of service depends on the interoperability of devices. Because various
devices connect using different technologies, universal standards are lacking. So, as a future direction,
intelligent and comprehensive technology is required to address the interoperability problem at all stages
of communication.
f. Single point of failure: Because of modern, multi-lateral network environments, SDN becomes more
complicated, and maintaining the same continuity capabilities introduces several significant challenges to
existing network efficiency structures. While the controller makes the operation and maintenance of a
network incredibly easy for a network operator, it requires complex reasoning and becomes a single point of
failure within the network. Therefore, an artificial intelligence-enabled technique is needed to predict
failures, and measures are to be taken to avoid them, which is one of the future directions.
g. Programming: The controller's design failures may cost the network provider dearly. One of the primary
sources of this difficulty is network failures, since they trigger the execution of untested parts of code;
such network errors are imminent. So, an efficient program should identify such failures and predict them
before they occur based on past experiences, which is another future direction.
h. Load balancing: Wireless sensor devices communicate with each other and transmit requests using the SDN
interface. The protocol used for communication between the device and the control layer is the OpenFlow
protocol, and the entries in the flow tables guide the order of execution. So, to dynamically allocate the
load, an efficient artificial intelligence algorithm is needed. If a customized algorithm is built, the QoS
can be improved.
i. Resource management: Efficient management of resources improves service efficiency parameters.
Maintaining resource usage statistics is essential to keep track of and monitor the resources. As a future
direction, optimal resource management should be performed using artificial intelligence techniques to
improve the service assessment metrics.
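The bookkeeping described above can be sketched as a sliding window of per-resource utilisation samples with summary statistics that an AI scheduler could consume. The class and metric names are illustrative assumptions, not an interface defined in this paper.

```python
# Minimal sketch of resource-usage statistics tracking: a sliding window of
# utilisation samples per resource, with mean and p95 summaries. Names are
# illustrative assumptions.
from collections import deque, defaultdict

class ResourceMonitor:
    def __init__(self, window=100):
        # One bounded sample window per resource (e.g., "cpu", "memory").
        self.samples = defaultdict(lambda: deque(maxlen=window))

    def record(self, resource, utilisation):
        self.samples[resource].append(utilisation)

    def stats(self, resource):
        data = sorted(self.samples[resource])
        n = len(data)
        return {
            "mean": sum(data) / n,
            "p95": data[min(n - 1, int(0.95 * n))],  # crude 95th percentile
        }

monitor = ResourceMonitor()
for u in [30, 35, 40, 90, 38, 36, 33, 41, 37, 34]:
    monitor.record("cpu", u)
print(monitor.stats("cpu"))
```

Statistics such as these give a learned scheduler the historical signal it needs to place workloads optimally rather than reactively.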
j. Energy efficiency: The major concern is to attain green computing, and to enable this, the heat pollution
from the many connected devices must be regulated. Artificial intelligence intervention can help conserve
energy by activating these devices only when necessary, effectively reducing their energy consumption.
For many researchers, reducing energy consumption remains an open issue and is considered a significant
future direction for controlling the harmful heat generated by machines.
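The "active only when necessary" idea above can be sketched as a duty-cycling rule: a device wakes only when predicted demand warrants it. The naive moving-average forecast and threshold here are stand-ins for the learned predictor an AI-enabled system would use.

```python
# Hedged sketch of demand-driven duty cycling for IoT devices. The forecast
# is a naive moving average; threshold and horizon are illustrative.

def schedule_power(demand_history, horizon=3, wake_threshold=1.0):
    """Return 'active' if recent average demand justifies waking the device."""
    recent = demand_history[-horizon:]
    predicted = sum(recent) / len(recent)   # naive demand forecast
    return "active" if predicted >= wake_threshold else "sleep"

print(schedule_power([0, 0, 5, 4, 6]))   # busy recently -> "active"
print(schedule_power([6, 4, 0, 0, 0]))   # idle recently -> "sleep"
```

Keeping idle devices asleep in this way reduces both energy draw and the heat they dissipate.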
6. CONCLUSION
This paper discusses the role of AI in cloud computing, IoT, and SDN and raises several issues
regarding the integration of AI with cloud, IoT, and SDN, as these are today's key developments. Artificial
intelligence methods are widely used and can operate with little or no human intervention when paired with
the patterns in the data so as to function intelligently. The reliability and scalability testing attributes used to
assess performance have been discussed. The role of AI in addressing the reliability and scalability
challenges of cloud computing, IoT, and SDN has been discussed both for each technology separately and for
their integration. Finally, the authors presented future directions and open issues to be addressed while
merging the mentioned technologies. However, training data issues, deployment issues, and computing
performance have not been considered in the integration. In the process of solving the complexities of this
fusion, the individual development problems will also be answered by resolving the issues common to all
these phenomena. This study will support consumers and researchers who choose to perform further work on
some of the above application combinations in the future.
ISSN: 2088-8708. Int J Elec & Comp Eng, Vol. 11, No. 5, October 2021: 4458-4470
REFERENCES
[1] L. Tredinnick, “Artificial intelligence and professional roles,” Business Information Review, vol. 34, no. 1,
pp. 37-41, 2017, doi: 10.1177/0266382117692621.
[2] S. Mittal, M. A. Khan, D. Romero, and T. Wuest, “A critical review of smart manufacturing and Industry 4.0
maturity models: Implications for small and medium-sized enterprises (SMEs),” Journal of manufacturing systems,
vol. 49, pp. 194-214, 2018, doi: 10.1016/j.jmsy.2018.10.005.
[3] M. Noura, M. Atiquzzaman, and M. Gaedke, “Interoperability in internet of things: Taxonomies and open
challenges,” Mobile networks and applications, vol. 24, no. 3, pp. 796-809, 2019, doi: 10.1007/s11036-018-1089-9.
[4] S. Taherizadeh and V. Stankovski, “Dynamic multi-level auto-scaling rules for containerized applications,” The
Computer Journal, vol. 62, no. 2, pp. 174-197, 2019, doi: 10.1093/comjnl/bxy043.
[5] M. Saturno, V. M. Pertel, F. Deschamps, and E. de F. R. Loures, “Proposal of an automation solutions architecture
for Industry 4.0,” 24th International Conference on Production Research, Poznan, Poland, 2017.
[6] V. Reddy, “Integration of internet of things with wireless sensor network,” International Journal of Electrical and
Computer Engineering, vol. 9, no. 1, pp. 439-444, 2019, doi: 10.11591/ijece.v9i1.pp439-444.
[7] A. AlEnezi, Z. AlMeraj, and P. Manuel, "Challenges of IoT based Smart-government Development," 2018 21st
Saudi Computer Society National Computer Conference (NCC), Riyadh, Saudi Arabia, 2018, pp. 1-6,
doi: 10.1109/NCG.2018.8593168.
[8] P. K. Sharma, M. Chen, and J. H. Park, "A Software Defined Fog Node Based Distributed Blockchain Cloud
Architecture for IoT," in IEEE Access, vol. 6, pp. 115-124, 2018, doi: 10.1109/ACCESS.2017.2757955.
[9] S. S. Gill and R. Buyya, "Failure Management for Reliable Cloud Computing: A Taxonomy, Model, and Future
Directions," in Computing in Science and Engineering, vol. 22, no. 3, pp. 52-63, 1 May-June 2020,
doi: 10.1109/MCSE.2018.2873866.
[10] O. Hohlfeld, J. Kempf, M. Reisslein, S. Schmid, and N. Shah, "Guest Editorial Scalability Issues and Solutions for
Software Defined Networks," in IEEE Journal on Selected Areas in Communications, vol. 36, no. 12,
pp. 2595-2602, Dec. 2018, doi: 10.1109/JSAC.2018.2872214.
[11] S. Sharafeddine and O. Farhat, “A proactive scalable approach for reliable cluster formation in wireless networks
with D2D offloading,” Ad Hoc Networks, vol. 77, pp. 42-53, 2018, doi: 10.1016/j.adhoc.2018.04.010.
[12] H. Karimipour, A. Dehghantanha, R. M. Parizi, K. R. Choo, and H. Leung, "A Deep and Scalable Unsupervised
Machine Learning System for Cyber-Attack Detection in Large-Scale Smart Grids," in IEEE Access, vol. 7,
pp. 80778-80788, 2019, doi: 10.1109/ACCESS.2019.2920326.
[13] E. Viegas, A. Santin, A. Bessani, and N. Neves, “BigFlow: Real-time and reliable anomaly-based intrusion
detection for high-speed networks,” Future Generation Computer Systems, vol. 93, pp. 473-485, 2019,
doi: 10.1016/j.future.2018.09.051.
[14] Y. Narasimha Rao, P. S. Chandra, V. Revathi, and N. S. Kumar, “Providing enhanced security in IoT based smart
weather system,” Indonesian Journal of Electrical Engineering and Computer Science, vol. 18, no. 1, pp. 9-15,
2020, doi: 10.11591/ijeecs.v18.i1.pp9-15.
[15] V. Plantevin, A. Bouzouane, B. Bouchard, and S. Gaboury, “Towards a more reliable and scalable architecture for
smart home environments,” Journal of Ambient Intelligence and Humanized Computing, vol. 10, no. 7,
pp. 2645-2656, 2019, doi: 10.1007/s12652-018-0954-5.
[16] N. Shaukat et al., “A survey on consumers empowerment, communication technologies, and renewable generation
penetration within Smart Grid,” Renewable and Sustainable Energy Reviews, vol. 81, pp. 1453-1475, 2018,
doi: 10.1016/j.rser.2017.05.208.
[17] L. Bittencourt et al., “The internet of things, fog and cloud continuum: Integration and challenges,” Internet of
Things, vol. 3, pp. 134-155, 2018, doi: 10.1016/j.iot.2018.09.005.
[18] B. Varghese and R. Buyya, “Next-generation cloud computing: New trends and research directions,” Future
Generation Computer Systems, vol. 79, pp. 849-861, 2018, doi: 10.1016/j.future.2017.09.020.
[19] M. U. Bokhari, Q. Makki, and Y. K. Tamandani, “A survey on cloud computing,” in Big Data Analytics, Springer,
2018, pp. 149-164, doi: 10.1007/978-981-10-6620-7_16.
[20] E. Turban, J. Outland, D. King, J. K. Lee, T.-P. Liang, and D. C. Turban, “Intelligent (smart) E-commerce,” in
Electronic Commerce 2018, Springer, 2018, pp. 249-283, doi: 10.1007/978-3-319-58715-8_7.
[21] S. A. Javadi, R. Cloete, J. Cobbe, M. S. A. Lee, and J. Singh, “Monitoring Misuse for Accountable ‘Artificial
Intelligence as a Service’,” in Proceedings of the AAAI/ACM Conference on AI, Ethics, and Society, 2020,
pp. 300-306, doi: 10.1145/3375627.3375873.
[22] J. Lee and J. Gil, “Adaptive fault-tolerant scheduling strategies for mobile cloud computing,” The Journal of
Supercomputing, vol. 75, no. 8, pp. 4472-4488, 2019, doi: 10.1007/s11227-019-02745-5.
[23] W. Hummer, et al., "ModelOps: Cloud-Based Lifecycle Management for Reliable and Trusted AI," 2019 IEEE
International Conference on Cloud Engineering (IC2E), Prague, Czech Republic, 2019, pp. 113-120,
doi: 10.1109/IC2E.2019.00025.
[24] J. I. R. Molano, J. M. C. Lovelle, C. E. Montenegro, J. J. R. Granados, and R. G. Crespo, “Metamodel for
integration of internet of things, social networks, the cloud and industry 4.0,” Journal of ambient intelligence and
humanized computing, vol. 9, no. 3, pp. 709-723, 2018, doi: 10.1007/s12652-017-0469-5.
[25] Z. Alansari, S. Soomro, M. R. Belgaum, and S. Shamshirband, “The rise of Internet of Things (IoT) in big
healthcare data: review and open research issues,” in Progress in Advanced Computing and Intelligent Engineering,
Springer, 2018, pp. 675-685, doi: 10.1007/978-981-10-6875-1_66.
[26] M. M. Tumari, M. H. Suid, and M. A. Ahmad, “A modified Grey Wolf Optimizer for improving wind plant energy
production,” Indonesian Journal of Electrical Engineering and Computer Science, vol. 18, no. 3, pp. 1123-1129,
2020, doi: 10.11591/ijeecs.v18.i3.pp1123-1129.
[27] I. O. Essiet, Y. Sun, and Z. Wang, “Optimized energy consumption model for smart home using improved
differential evolution algorithm,” Energy, vol. 172, pp. 354-365, 2019, doi: 10.1016/j.energy.2019.01.137.
[28] M. Fayaz, H. Shah, A. M. Aseere, W. K. Mashwani, and A. S. Shah, “A framework for prediction of household
energy consumption using feed-forward back propagation neural network,” Technologies, vol. 7, no. 2, p. 30, 2019,
doi: 10.3390/technologies7020030.
[29] B. Reynders, Q. Wang, P. Tuset-Peiro, X. Vilajosana, and S. Pollin, "Improving Reliability and Scalability of
LoRaWANs Through Lightweight Scheduling," in IEEE Internet of Things Journal, vol. 5, no. 3, pp. 1830-1842,
June 2018, doi: 10.1109/JIOT.2018.2815150.
[30] L. Xiao, X. Wan, X. Lu, Y. Zhang, and D. Wu, "IoT Security Techniques Based on Machine Learning: How Do
IoT Devices Use AI to Enhance Security?," in IEEE Signal Processing Magazine, vol. 35, no. 5, pp. 41-49, Sept.
2018, doi: 10.1109/MSP.2018.2825478.
[31] M. R. Belgaum, S. Musa, M. M. Alam, and M. M. Su’ud, "A systematic review of load balancing techniques in software-
defined networking," in IEEE Access, vol. 8, pp. 98612-98636, 2020, doi: 10.1109/ACCESS.2020.2995849.
[32] C. Chuang, Y. Yu, and A. Pang, "Flow-Aware Routing and Forwarding for SDN Scalability in Wireless Data
Centers," in IEEE Transactions on Network and Service Management, vol. 15, no. 4, pp. 1676-1691, Dec. 2018,
doi: 10.1109/TNSM.2018.2865166.
[33] T. A. Tang, L. Mhamdi, D. McLernon, S. A. R. Zaidi, and M. Ghogho, "Deep Recurrent Neural Network for
Intrusion Detection in SDN-based Networks," 2018 4th IEEE Conference on Network Softwarization and
Workshops (NetSoft), Montreal, QC, Canada, 2018, pp. 202-206, doi: 10.1109/NETSOFT.2018.8460090.
[34] S. Song, H. Park, B. Choi, T. Choi, and H. Zhu, "Control Path Management Framework for Enhancing Software-
Defined Network (SDN) Reliability," in IEEE Transactions on Network and Service Management, vol. 14, no. 2,
pp. 302-316, June 2017, doi: 10.1109/TNSM.2017.2669082.
[35] K. E. Benson, G. Wang, N. Venkatasubramanian and Y. Kim, "Ride: A Resilient IoT Data Exchange Middleware
Leveraging SDN and Edge Cloud Resources," 2018 IEEE/ACM Third International Conference on Internet-of-
Things Design and Implementation (IoTDI), Orlando, FL, USA, 2018, pp. 72-83, doi: 10.1109/IoTDI.2018.00017.
[36] M. R. Belgaum, S. Musa, M. Alam and M. S. Mazliham, "Integration challenges of Artificial Intelligence in Cloud
Computing, Internet of Things and Software-defined networking," 2019 13th International Conference on
Mathematics, Actuarial Science, Computer Science and Statistics (MACS), Karachi, Pakistan, 2019, pp. 1-5,
doi: 10.1109/MACS48846.2019.9024828.
BIOGRAPHIES OF AUTHORS
Mohammad Riyaz Belgaum received his master’s degree with a specialization in Computer
applications in 2001 and a Master of Engineering with a specialization in Computer Science and
Engineering in 2006. He is an Oracle Database 11G Administrator Certified Professional and an
Oracle Database 11G Program with PL/SQL Certified Associate. He is currently pursuing his
PhD in Information Technology at Universiti Kuala Lumpur with a focus on Software Defined
Networks. His areas of interest include cloud computing, mobile ad hoc networking, wireless
sensor networks, Internet of Things, and bytecode analysis. He has authored more than 30
research articles published in journals and peer-reviewed conferences. He has vast teaching
experience across various countries with much focus on research.
Zainab Alansari received her B.S. degree in Computer Science with a summa cum laude award in
2007 and her M.S. degree in Computer Systems and Networking with the highest honours and the
best thesis award in 2011, both from AMA International University, Bahrain. Since 2007, she has
been a lecturer and instructor of Computer Science at several institutions internationally. She has
published more than 35 ISI- and Scopus-indexed articles and contributed to numerous
international conferences and seminars as a presenter, keynote speaker, and session chair. Her
research interests include the Internet of Things, wireless sensor networks, big data, and cloud
computing. She is currently a senior lecturer at Majan University College, Muscat, Oman, and is
completing her PhD in Computer Science at the University of Malaya (UM) in Malaysia.
Shahrulniza Musa is the Deputy President in charge of Academics and Technology at Universiti
Kuala Lumpur (UniKL), where he has worked since its establishment in 2002. He received his
Post-Graduate Diploma in Integrated Research Study in 2005 and his Doctor of Philosophy
(PhD) in Communication Network Security in 2008, both from the Faculty of Electrical and
Electronic Engineering, Loughborough University, UK. His research interests are in
cybersecurity, IoT applications, IoT security, big data analytics, and SDN. Apart from teaching
and postgraduate supervision, he is also active in software project consultation and development
in business applications, enterprise resource planning (ERP), and customer relationship
management (CRM). His ORCID ID and Scopus ID are 0000-0003-4867-5085 and
52963995100, respectively.
Muhammad Mansoor Alam received an M.E. degree in systems engineering, an M.Sc. degree
in computer science, a PhD degree in computer engineering, and a PhD degree in electrical and
electronic engineering. He is an active researcher in the field of telecommunications and networking.
He has authored more than 60 research articles published in ISI indexed journals, as book
chapters, and in peer-reviewed conferences. He is also an author of the book “Study Guide of
Network Security” copyrighted by Open University Malaysia and The Open University of Hong
Kong. He is also an active reviewer for the ISI indexed journal Pertanika Journal of Science and
Technology (JST).
Mazliham Mohd Su’ud received a PhD degree in computer engineering from the Université de
La Rochelle, in 2007 and a master’s degree in electrical and electronics engineering from the
University of Montpellier, in 1993. He was the President/CEO of Universiti Kuala Lumpur,
Malaysia, for seven years from 2013. He is currently the President/CEO of Multimedia
University, Malaysia.