Smart cities use Internet of Things (IoT) devices and sensors to enhance the quality of city services, including energy, transportation, health, and much more. These devices generate massive volumes of structured and unstructured data on a daily basis. Social networks, such as Twitter, Facebook, and Google+, are also becoming a new source of real-time information in smart cities, with social network users acting as social sensors. Datasets this large and complex are difficult to manage with conventional data management tools and methods. To become valuable, this massive amount of data, known as 'big data', needs to be processed and comprehended if it is to support a broad range of urban and smart city functions, including, among others, transportation, water and energy consumption, pollution surveillance, and smart city governance. In this work, we investigate how social media analytics can help analyze smart city data collected from various social media sources, such as Twitter and Facebook, to detect events taking place in a smart city and to identify the importance of those events and the concerns of citizens regarding them. A case scenario analyses the opinions of users concerning traffic in the three largest cities in the UAE.
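The abstract does not detail the analysis method. A minimal lexicon-based sketch of per-city opinion scoring over traffic tweets might look like the following; the keyword lists, city names, and sample tweets are illustrative assumptions, not material from the paper.

```python
# Minimal lexicon-based sentiment scoring for traffic-related tweets.
# The lexicons below are illustrative; the paper does not specify its
# exact method or keyword lists.

POSITIVE = {"smooth", "fast", "clear", "improved", "easy"}
NEGATIVE = {"jam", "congested", "stuck", "slow", "accident", "blocked"}

def sentiment_score(tweet: str) -> int:
    """Return +1, -1, or 0 by comparing positive vs. negative keyword counts."""
    words = set(tweet.lower().split())
    return (len(words & POSITIVE) > len(words & NEGATIVE)) - \
           (len(words & NEGATIVE) > len(words & POSITIVE))

def city_opinion(tweets_by_city: dict) -> dict:
    """Aggregate per-city sentiment as the mean of tweet scores."""
    return {
        city: sum(sentiment_score(t) for t in tweets) / len(tweets)
        for city, tweets in tweets_by_city.items() if tweets
    }

tweets = {
    "Dubai": ["Traffic is smooth on SZR today", "Huge jam near the mall"],
    "Abu Dhabi": ["Stuck in a jam again", "Road blocked after accident"],
}
print(city_opinion(tweets))  # mean score per city, in [-1, 1]
```

A production system would replace the keyword lexicons with a trained sentiment classifier, but the aggregation step (averaging per-tweet scores per city) would look much the same.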
Semantics-empowered Smart City applications: today and tomorrow
Amit Sheth
Citation:
Amit Sheth, "Semantics-empowered Smart City applications: today and tomorrow," Keynote presented at the 6th Workshop on Semantics for Smarter Cities (S4SC 2015), collocated with the 14th International Semantic Web Conference (ISWC 2015), Bethlehem, PA, USA, Oct 11-12, 2015.
http://kat.ee.surrey.ac.uk/wssc/index.html
Abstract: There has been a massive growth in potentially relevant physical (sensor/IoT), cyber (Web), and social data related to the activities and operations of cities and citizens. As part of our participation in smart city projects, including the EU-funded CityPulse project, we have analyzed a large number of use cases with inputs from city administrations and end users, and developed a few early applications. In this talk, I will present some exciting smart city applications possible today and venture to speculate on some future ones where Big Data technologies and semantic computing, including the use of domain knowledge, play a critical role.
A Quintessential smart city infrastructure framework for all stakeholders
Jonathan L. Tan, M.B.A.
The Smart City Infrastructure Framework provides guidance on open government data and infrastructure essentials for ICT/Telecom, Energy/Renewable Energy, Water/Waste Water, Transportation, Education, Health, and Government Services systems.
I. Smart City Drivers
Smart City Definition
Smart City Elements
II. Smart City Infrastructure Frameworks
III. Technology Ecosystem
Stakeholders
ICT Essentials
OGD
ICT for Building Automation
Smart Water
Smart Energy
Smart Transportation
Smart Education
Smart Healthcare
Smart City Services
IV. Smart City Applications
V. Smart City Systems Infrastructure
Top SC Vendors
Data privacy and security in ICT4D - Meeting Report
UN Global Pulse
On May 8th, 2015 UN Global Pulse hosted a workshop on data privacy and security in technology-enabled development projects and programmes, as part of a series of events about the Nine Principles for Digital Development. This report summarizes the presentations and discussions from the workshop. http://unglobalpulse.org/blog/improving-privacy-and-data-security-ict4d-projects
The interpretation of the relationship among big data, IoT and smart city - C...
Antenna Manufacturer Coco
Big data is an intangible means of production in the information society, and its concept has been repeatedly interpreted by various sectors of society. However, many people are unclear about the relationship between big data, the Internet of Things, and the smart city. In this regard, Zhao Ying of the Tong Fang Internet of Things Industrial Technology Division gave a detailed interpretation of this.
Smart Data and real-world semantic web applications (2004)
Amit Sheth
Probably the first recorded use of "smart data" for achieving the Semantic Web and for realizing productivity, efficiency, and effectiveness gains by using semantics to transform raw data into Smart Data.
2013 retake on this is discussed at: http://wiki.knoesis.org/index.php/Smart_Data
Physical Cyber Social Computing: An early 21st century approach to Computing ...
Amit Sheth
Keynote given at WiMS 2013 Conference, June 12-14 2013, Madrid, Spain. http://aida.ii.uam.es/wims13/keynotes.php
Video of this talk at: http://videolectures.net/wims2013_sheth_physical_cyber_social_computing/
More information at: http://wiki.knoesis.org/index.php/PCS
and http://knoesis.org/projects/ssw/
Replacing earlier versions: http://www.slideshare.net/apsheth/physical-cyber-social-computing & http://www.slideshare.net/apsheth/semantics-empowered-physicalcybersocial-systems-for-earthcube
Abstract: The proper role of technology to improve human experience has been discussed by visionaries and scientists from the early days of computing and electronic communication. Technology now plays an increasingly important role in facilitating and improving personal and social activities and engagements, decision making, interaction with physical and social worlds, generating insights, and just about anything that an intelligent human seeks to do. I have used the term Computing for Human Experience (CHE) [1] to capture this essential role of technology in a human centric vision. CHE emphasizes the unobtrusive, supportive and assistive role of technology in improving human experience, so that technology “takes into account the human world and allows computers themselves to disappear in the background” (Mark Weiser [2]).
In this talk, I will portray physical-cyber-social (PCS) computing that takes ideas from, and goes significantly beyond, the current progress in cyber-physical systems, socio-technical systems and cyber-social systems to support CHE [3]. I will exemplify future PCS application scenarios in healthcare and traffic management that are supported by (a) a deeper and richer semantic interdependence and interplay between sensors and devices at physical layers, (b) rich technology mediated social interactions, and (c) the gathering and application of collective intelligence characterized by massive and contextually relevant background knowledge and advanced reasoning in order to bridge machine and human perceptions. I will share an example of PCS computing using semantic perception [4], which converts low-level, heterogeneous, multimodal and contextually relevant data into high-level abstractions that can provide insights and assist humans in making complex decisions. The key proposition is to explain that PCS computing will need to move away from traditional data processing to multi-tier computation along data-information-knowledge-wisdom dimension that supports reasoning to convert data into abstractions that humans are adept at using.
[1] A. Sheth, Computing for Human Experience
[2] M. Weiser, The Computer for 21st Century
[3] A. Sheth, Semantics empowered Cyber-Physical-Social Systems
[4] C. Henson, A. Sheth, K. Thirunarayan, Semantic Perception: Converting Sensory Observations to Abstractions
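As a rough illustration of the semantic perception idea in [4], the sketch below converts low-level sensor readings into high-level abstractions by matching observed properties against background knowledge. The knowledge base, property names, and thresholds are invented for illustration and are not taken from the cited work.

```python
# Illustrative sketch of semantic perception: raw observations are
# discretized into properties, then matched against background
# knowledge to find abstractions they fully support.

KNOWLEDGE = {
    # abstraction -> set of observed properties that explain it
    "blizzard": {"low_temperature", "high_wind", "precipitation"},
    "heat_wave": {"high_temperature", "low_humidity"},
}

def observe(readings: dict) -> set:
    """Map raw numeric readings to discrete observed properties."""
    props = set()
    if readings.get("temp_c", 20) <= 0:
        props.add("low_temperature")
    if readings.get("temp_c", 20) >= 40:
        props.add("high_temperature")
    if readings.get("wind_kmh", 0) >= 60:
        props.add("high_wind")
    if readings.get("precip_mm", 0) > 0:
        props.add("precipitation")
    if readings.get("humidity_pct", 50) <= 20:
        props.add("low_humidity")
    return props

def explain(props: set) -> list:
    """Return abstractions whose expected properties are all observed."""
    return [a for a, req in KNOWLEDGE.items() if req <= props]

print(explain(observe({"temp_c": -5, "wind_kmh": 80, "precip_mm": 3})))
```

The point of the abstraction step is that downstream consumers (humans or applications) reason over labels like "blizzard" rather than raw multimodal sensor streams.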
In the analogue era, information was scarce and came from questionnaires and sampling. Since the dawn of the digital age, in 2012, far more data than ever before has been stored, and it is mainly collected passively, i.e. while people go about doing what they normally do, such as running their businesses, using their cell phones, and conducting internet searches.
Analysts, policy makers and business people value business tendency surveys (BTS) and consumer opinion surveys (COS) specifically because the survey results are available before the corresponding (official) quantitative data. However, Big Data has begun to make inroads into areas traditionally covered by BTS and COS. It has a competitive edge over BTS and COS, as it is available in real time, is based on all observations, and does not rely on the active participation of respondents. Furthermore, Big Data has low direct production costs, because it is merely a by-product of business processes. In contrast, putting together and maintaining a sample of active respondents and collecting information through questionnaires, as in the case of BTS and COS, requires the upkeep of a costly infrastructure and the employment of people with scarce, specialised skills.
However, BTS and COS also have a competitive edge over Big Data in certain respects. These could broadly be put into two groups: 1) BTS and COS offer information that Big Data cannot supply, and 2) BTS and COS do not suffer from some of the shortcomings of Big Data. The biggest competitive advantage of BTS and COS is that they measure phenomena that Big Data does not cover. Big Data records only actual outcomes, while BTS and COS also cover unquantifiable expectations and assessments. Although Big Data often claims to cover the whole population (and not only a selection), this does not necessarily prevent bias. For example, Twitter feeds could be biased because certain demographic or less activist groups are under-represented. In contrast, the research design and random sampling of BTS and COS limit their selection bias.
To remain relevant and survive, producers of BTS and COS will have to adapt and publicise their unique competitive advantage vis-à-vis Big Data in the future. The biggest shift will probably require that producers of BTS and COS make users more aware of the value of their unique forward-looking information (i.e. their recording of expectations about the future).
Did you know that THIS MORNING there is more data in the world than EVER BEFORE?!
By 2018, 40% of enterprise architecture teams will be distinguished as leaders by their primary focus on applying disruptive technologies and the power of Big Data to drive business innovation.
Presentation by Antonio Cordella at the seminar "E-Government: Teorie e Pratiche nei Paesi Maturi e in via di Sviluppo" (E-Government: Theories and Practices in Mature and Developing Countries)
www.thinkinnovation.org
www.forumpa.it
Digital Authoritarianism: Implications, Ethics, and Safeguards
Andy Aukerman
Artificial intelligence, or AI, is a common motif in science fiction literature: the trope of a "robot uprising", where AI becomes sufficiently advanced that it surpasses human intelligence and escapes human control. Common perceptions of AI focus on the ethical and human impacts of a malevolent, artificially intelligent agent itself. In this document, I wish instead to focus on an equally important and, I will argue, more plausible case of sufficiently advanced AI that poses immense risk to human activity: the use of AI in conjunction with big data for authoritarian rule and population control. In these scenarios, AI has no agency and instead serves as a sufficiently advanced and intelligent tool for human agents. Throughout this document, I summarize current and potential applications of this type of AI, explore the ethical ramifications, and finally propose and evaluate solutions and safeguards.
Al-Khouri, A.M. (2014) "Privacy in the Age of Big Data: Exploring the Role of Modern Identity Management Systems". World Journal of Social Science, Vol. 1, No. 1, pp. 37-47.
Trustworthy Sensing for Public Safety in Cloud Centric Things of Internet wit...
RSIS International
The Things of Internet (TOI) paradigm stands for virtually interconnected objects that can be identified through their devices and services with Wi-Fi sensing, computing, and communication systems. Implementations, applications, and their services are built on the TOI architecture, which can also benefit from cloud computing services. Sensing Service (SS) is a cloud-inspired service model used to enable access to the TOI. We present a framework in which the TOI can enhance public safety through a crowd management system and also provide sensing services with the different types of sensor devices available. To ensure trustworthiness in the presented framework, where users contribute reviews supported by network incentives, the trustworthiness of the data can also be checked. We have designed a mobile services application to demonstrate how users can connect to Wi-Fi hotspots and how the networks operate in a crowdsourced Wi-Fi sensing system. We propose an SS scheme, Sensing for Crowd Management (SCM), for front-end access to the TOI that collects and shares user experience through Wi-Fi hotspots in urban and rural areas, and we incorporate the network into the system. SCM senses data on a cloud model and includes a procedure that selects mobile devices for specific tasks and determines the payments to users whose mobile devices provide data. The performance of SCM shows that the impact of malicious users on the crowdsourced data can be degraded by 70%, while the trustworthiness of a malicious user converges to a value below 30% after a few auctions. Moreover, we show that SCM can enhance the utility of the public safety authority by up to 80%.
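The convergence behaviour alluded to in the abstract (a malicious user's trust falling below 30% after a few auctions) can be illustrated with a simple reputation update. The update rule, learning rate, and feedback encoding below are assumptions for illustration, not the scheme from the paper.

```python
# Hedged sketch of a reputation update: after each auction round, a
# user's trust score moves toward the feedback received
# (1.0 = report judged truthful, 0.0 = report judged false).
# The rule and learning rate are assumptions, not from the paper.

def update_trust(trust: float, feedback: float, alpha: float = 0.4) -> float:
    """Exponential moving average of per-auction feedback."""
    return (1 - alpha) * trust + alpha * feedback

trust = 0.5                       # neutral prior
for _ in range(5):                # a few auctions with false reports
    trust = update_trust(trust, feedback=0.0)
print(round(trust, 3))            # 0.5 * 0.6**5, well below 0.3
```

With consistently false reports, the score decays geometrically, so a malicious user drops below any fixed trust threshold after only a handful of rounds.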
Using Data and New Technology for Peacemaking, Preventive Diplomacy, and Peac...
UN Global Pulse
This guide offers an overview of e-analytics in the context of peacemaking and preventive diplomacy. It presents a summary of e-analytics tools as well as examples from the peace and security field. It includes a data project planning matrix that aims to help facilitate and motivate data-driven analysis. Part of the guide is a glossary on basic terminology related to new technologies.
Analyzing Role of Big Data and IoT in Smart Cities
IJAEMSJORNAL
Big data and Internet of Things (IoT) technologies have evolved and expanded tremendously and hence play a major role in building feasible initiatives for smart city development. IoT and big data form a perfect blend in bringing an interesting and novel challenge to attaining futuristic smart cities. These new challenges mainly focus on business- and technology-related issues that help smart cities formulate their principles, vision, and requirements for smart city applications. In this paper, the role of big data and IoT technologies with respect to smart cities is analyzed. The benefits that smart cities will gain from big data and IoT are also discussed, and various challenges faced by smart cities related to big data and IoT are described. Moreover, future statistics of IoT and big data with respect to smart cities are also deliberated.
What Data Can Do: A Typology of Mechanisms
Angèle Christin
International Journal of Communication, Vol. 14 (2020), by Angèle Christin of the Department of Communication at Stanford University, USA, titled "What Data Can Do: A Typology of Mechanisms". Among other things, she is the author of the book "Metrics at Work".
Mobile Computing, Internet of Things, and Big Data for Urban Informatics
Praveen Rao
Anirban Mondal, Praveen Rao and Sanjay K. Madria, "Mobile Computing, Internet of Things, and Big Data for Urban Informatics," 2016 17th IEEE International Conference on Mobile Data Management (MDM), Porto, 2016, pp. 8-11.
DOI: 10.1109/MDM.2016.81
‘The State of Mobile Data for Social Good’ report is a collaboration between UN Global Pulse and the GSMA, the global mobile telecommunications industry association. The report, which identifies over 200 projects or studies leveraging mobile data for social good, aims to survey the landscape today, assess the current barriers to scale, and make recommendations for a way forward. It details some of the main challenges with using mobile data for social good and provides a set of actions that (i) can spur investment and use, (ii) ensure cohesion of efforts and of customer privacy and data protection frameworks and (iii) build technical capacity.
Event detection in Twitter using text and image fusion
csandit
In this paper, we describe an accurate and effective method for detecting events in a Twitter stream. It uses visual as well as textual information to improve mining performance. It monitors the Twitter stream to pick up tweets containing text and photos and stores them in a database, then applies a mining algorithm to detect events. First, it detects events based on text only, using bag-of-words features calculated with the term frequency-inverse document frequency (TF-IDF) method. Second, it detects events based on images only, using visual features including histogram of oriented gradients (HOG) descriptors, the grey-level co-occurrence matrix (GLCM), and color histograms. K-nearest-neighbours (kNN) classification is used in the detection. Finally, the overall decision is made based on the reliabilities of the text-only and image-only detections. Experiments showed that the proposed method achieved a high accuracy of 0.93, compared with 0.89 for text only and 0.86 for images only.
In this contribution, we develop an accurate and effective event detection method for a Twitter stream, which uses visual and textual information to improve the performance of the mining process. The method monitors a Twitter stream to pick up tweets containing text and images and stores them in a database. A mining algorithm is then applied to detect an event. The procedure starts by detecting events based on text only, using bag-of-words features calculated with the term frequency-inverse document frequency (TF-IDF) method. It then detects the event based on images only, using visual features including histogram of oriented gradients (HOG) descriptors, the grey-level co-occurrence matrix (GLCM), and color histograms. K-nearest-neighbours (kNN) classification is used in the detection. The final decision is made based on the reliabilities of the text-only and image-only detections. Experiments showed that the proposed method achieved a high accuracy of 0.94, compared with 0.89 for text only and 0.86 for images only.
Big Data: an opportunity for friendly cities
Lorena Pocatilu
The Bucharest University of Economic Studies, Economic Informatics and Cybernetics Department Bucharest, Romania
lorena.pocatilu@ie.ase.ro
The use of big data solutions is one of the biggest opportunities for friendly cities today, because cities need to access, process, and use many different types of data very quickly, and big data solutions offer exactly these facilities.
The idea that big data creates value is not new, and in our age the effective use of data is becoming a basic element of competition. Cities have always wanted to use information and knowledge correctly and to their real value in order to make better, smarter, real-time, fact-based decisions, and this need for correct knowledge has fueled the growth of big data. In this sense, big data is an essential support for the evolution of cities. Many cities around the world agree that this is true but are not sure how to make the most of its implementation. Following a literature review, this paper presents the steps for implementing big data solutions in the core areas of cities.
More and more organizations in business and administration agree that big data is an opportunity for friendly cities. This paper highlights, with examples from all over the world, that the areas that use big data obtain good results. The areas that succeed are not the ones that have the most data, but the ones that use it best. Big data will fundamentally change the way cities compete and operate. Organizations in business and administration that invest in and successfully derive value from their data will have a distinct advantage over their competitors, a performance gap that will continue to grow as more relevant data is generated, as emerging technologies and digital channels offer better acquisition and delivery mechanisms, and as the technologies that enable faster, easier data analysis continue to develop.
Investment and development are the keys to our cities. This paper presents the impact of big data solutions and how all their facilities can be used in the development of friendly cities. In view of the research in this area, city development that uses big data in accordance with sustainability principles has become an opportunity of this century. Efficient access to and use of huge quantities of data through big data solutions, together with the involvement of citizens in the initiatives of local communities, are the key elements a city can use to achieve harmonious development.
The research in this approach is centered on the necessity of using big data for friendly cities.
An Innovative, Open, Interoperable Citizen Engagement
Cloud Platform for Smart Government and Users’
Interaction
Diego Reforgiato Recupero1,6 & Mario Castronovo2 &
Sergio Consoli1 & Tarcisio Costanzo3 &
Aldo Gangemi1,4 & Luigi Grasso3 & Giorgia Lodi1 &
Gianluca Merendino3 & Misael Mongiovì1 &
Valentina Presutti1 & Salvatore Davide Rapisarda2 &
Salvo Rosa2 & Emanuele Spampinato5
Received: 10 November 2015 /Accepted: 20 January 2016 /
Published online: 30 January 2016
© Springer Science+Business Media New York 2016
Abstract This paper introduces an open, interoperable, and cloud-computing-based
citizen engagement platform for the management of administrative processes of public
administrations, which also increases the engagement of citizens. The citizen engagement
platform is the outcome of a 3-year Italian national project called PRISMA
(Interoperable cloud platforms for smart government; http://www.ponsmartcities-prisma.it/).
The aim of the project is to constitute a new model of digital ecosystem
that can support and enable new methods of interaction among public administrations,
citizens, companies, and other stakeholders surrounding cities. The platform has been
described by the media as a flexible (enabling the addition of any kind of application or
service) and open (enabling access to open services) Italian "cloud" that allows public
administrations to access a vast knowledge base represented as linked open data, to
be reused by a stakeholder community with the aim of developing new applications
("Cloud Apps") tailored to the specific needs of citizens. The platform has been used
by the Catania and Syracuse municipalities, two of the main cities of southern Italy, located
in the Sicilian region. The full adoption of the platform is rapidly spreading around the
whole region (local developers have already used the available application programming
interfaces (APIs) to create additional services for citizens and administrations), to such
an extent that other provinces of Sicily, and of Italy in general, have expressed their interest in
its usage. The platform is available online and, as mentioned above, is open source and
provides APIs for full exploitation.
J Knowl Econ (2016) 7:388–412
DOI 10.1007/s13132-016-0361-0
* Diego Reforgiato Recupero
[email protected]
1 National Research Council (CNR), Via Gaifami 18, 95126 Catania, Italy
2 Sielte, Via Cerza 4, 95027 San Gregorio di Catania, Italy
3 Datanet, Contrada Targia 58, 96100 Syracuse, Italy
4 Paris Nord University, Sorbonne Cité CNRS UMR7030, France
5 Etna Hitech, Viale Africa 31, 95129 Catania, Italy
6 Department of Mathematics and Computer Science, University of Cagliari, Cagliari, Italy
Keywords: Smart city · Smart governance · Linked open data · Citizen engagement · Cloud computing
This primer - or "Big Data 101" specifically for the international development and humanitarian communities - explains the concepts behind using Big Data for social good in easy-to-understand language. Published by the United Nations' Global Pulse initiative, which is exploring how new, digital data sources and real-time analytics technologies can help policymakers understand human well-being and emerging vulnerabilities in real-time. www.unglobalpulse.org
The objective of this module is to provide an overview of the basic information on big data.
Upon completion of this module you will:
- Comprehend the emerging role of big data
- Understand the key terms regarding big and smart data
- Know how big data can be turned into smart data
- Be able to apply the key terms regarding big data
Duration of the module: approximately 1–2 hours
Computer Science & Information Technology (CS & IT)
Big data analytics is a recent technology that has an immense potential to permit comprehending
city data and, hence, enhancing smart city services. Effective management and analysis of big
data is a fundamental component of achieving the goals of the smart city. These goals include
tackling urban problems, reducing resource consumption and costs, engaging actively with citizens,
and making informed decisions that will result in enhancing the environment and improving
economic outcomes, leading to improved quality of urban life. For the big data scientist, this
vast amount and variety of data presents an opportunity. The analysis of big data has the potential
to improve the lives of citizens and reduce costs by uncovering associations and by understanding
the trends and patterns in the data.
In this work, we propose to collect and analyze social media data concerning the main cities in
the UAE. An increasing number of events takes place every year in these cities. Therefore, it
is paramount to analyze citizens' conversations with regard to these events and other
concerns. First, we focus on data retrieved from Twitter, given the short and concise nature of tweets.
Twitter is an exciting source of information for real-time event detection and sentiment analysis.
The Twitter API is used to retrieve up-to-date tweets. As social media analytics draws on several analysis and
modeling techniques, we focus primarily on techniques such as sentiment and trend analysis
that support the data understanding phase. This quest will contribute to a better understanding of
the needs and concerns of the public so that event organizers and municipal governments can take
appropriate action to address these concerns.
The remainder of the paper is organized as follows. Section 2 provides background information
on the benefits of big data analytics in smart cities and the application of social media analytics
for events’ detection. Section 3 describes the methodology followed to analyze social media
data streams. Section 4 describes a case scenario in which tweets concerning the UAE are
analyzed by applying sentiment analysis. Finally, Section 5 concludes the paper and highlights
future work.
2. BACKGROUND AND LITERATURE REVIEW
2.1 Big data analytics applications
Using advanced analytics techniques such as data mining, machine learning, statistical learning,
text analytics, predictive analytics, and visualization tools, city stakeholders and local
governments would be able to analyze previously inaccessible or unusable data to gain new
insights resulting in significantly faster and informed decisions. These techniques can speed up
the analytical investigation, leading to insights from both traditional and non-traditional data
sources. Ideally, city and local governments would use data analytics to monitor public utilities,
alleviate traffic congestion, assess and anticipate crime, follow education trends, and carefully
watch public resources.
Another smart city application where big data analytics will play a vital role is crowd control.
Indeed, cities are increasingly crowded, and many events are organized that draw large
numbers of attendees. As a result, municipalities must provide many services to the
participants, such as safety, mobilized police and emergency responders, and basic needs such as
food and beverages. Crowds of people mean that massive amounts of data are
generated. Big data analytics can be used to predict the movement of the masses to avoid
crowd crushes or other severe disasters.
During emergencies, cities equipped with sensors can take advantage of collected data to make
informed decisions. Using social media analytics during emergency response is of particular
interest. Social media networks offer data streams, which can be used to collect near real-time
information concerning an emergency. Though people post many unrelated messages on social
networks, any emergency-related information can be valuable to emergency response teams and can
help them to get a good picture of the situation, permitting a more effective and faster response
that can reduce overall loss and damage [2].
2.2 Social media analytics
Social network sites, such as Facebook, Twitter, and Google+, have become so popular that
they represent a new source of real-time information concerning various topics and events. The
availability of social network sites on numerous devices ranging from personal computers to
tablets and smartphones contributed to their popularity [3][4].
People typically use social media networks to post small messages that allow them to express
their opinions on a variety of topics or to report events occurring in their vicinity. These
networks allow their users to have an identity, build small online communities, find other users
with similar interests, and find content published by other users [5]. Shared messages on these
networks are called Status Update Messages (SUM). In addition to the text of the message, a
SUM contains metadata information such as the name of the user, timestamp, geographic
coordinates (latitude and longitude), hashtags, and links to other resources.
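A SUM and its metadata can be modeled as a simple record type. The following sketch is illustrative only (the field names are our own, not a formal SUM schema):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class StatusUpdateMessage:
    """Illustrative model of a SUM: the message text plus its metadata."""
    text: str                           # body of the short message
    user: str                           # name of the posting user
    timestamp: str                      # posting time, e.g. ISO 8601
    latitude: Optional[float] = None    # geographic coordinates, if geotagged
    longitude: Optional[float] = None
    hashtags: List[str] = field(default_factory=list)
    links: List[str] = field(default_factory=list)  # links to other resources

# A hypothetical geotagged SUM about traffic.
sum_msg = StatusUpdateMessage(
    text="Heavy traffic on Sheikh Zayed Road #Dubai",
    user="sample_user",
    timestamp="2018-03-07T08:15:00Z",
    latitude=25.2048,
    longitude=55.2708,
    hashtags=["#Dubai"],
)
```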
The SUMs originating from users in a specific geographical area, such as a city, or discussing a
topic or raising a concern can offer valuable information on an issue or event that can help
decision-makers make informed decisions. As a result, social network users are considered as
social sensors [6][7], and SUMs as sensor information [8].
Social media analysis is about collecting data from social media websites and blogs and
processing that data into structured information that helps decision-makers make
information-driven decisions. The customer is at the center of these decisions. The most
common use of social media analytics is to leverage customers' opinions about products and
services to support customer service and marketing activities.
2.3 Detection of events from social media analytics
In recent years, social networks have become a new source of information that municipal
authorities can use to detect events, such as traffic jams, and get reports on incidents and natural
disasters (fires, storms, tremors, etc.) in their neighborhood. An event is a real-world occurrence
that happens at a given moment and place [3][9]. In particular, regarding traffic-related
incidents, people often share information via SUMs about the current traffic situation around
them while driving.
However, event detection from the analysis of social networks’ short-messages is more
challenging compared with event detection from conventional media, such as emails and blogs,
where texts are well-formatted [4]. SUMs are unstructured and irregular texts, which might
contain simple or abbreviated words, grammatical errors or misspellings [3]. They are usually
very brief, which makes them an incomplete source of information [4]. Moreover, SUMs
contain a massive amount of meaningless or useless information [10], which requires filtering.
To extract meaningful information from social media networks, it is necessary to use text
mining techniques that use methods in the fields of Natural Language Processing (NLP),
machine learning, and data mining [11][12].
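As a minimal illustration of such filtering (the keyword lexicon and rules below are invented for the example, not taken from the cited works), a rule-based pass can discard SUMs that carry no event-related signal before heavier NLP techniques are applied:

```python
import re

# Hypothetical keyword lexicon for traffic- and incident-related events.
EVENT_KEYWORDS = {"traffic", "accident", "jam", "fire", "storm", "congestion"}

def normalize(text: str) -> str:
    """Lowercase the text, strip URLs and user mentions, collapse whitespace."""
    text = re.sub(r"https?://\S+", " ", text.lower())
    text = re.sub(r"@\w+", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def is_relevant(text: str) -> bool:
    """Keep only SUMs mentioning at least one event keyword."""
    tokens = set(re.findall(r"[a-z]+", normalize(text)))
    return bool(tokens & EVENT_KEYWORDS)

msgs = [
    "Huge traffic jam near the mall http://t.co/x",
    "good morning everyone!",
]
relevant = [m for m in msgs if is_relevant(m)]  # keeps only the first message
```

In practice, such a filter would be only a first pass; the works cited above combine it with trained classifiers.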
Concerning current approaches for using social media to obtain useful information for event
detection, there is a need to distinguish between large-scale events and small-scale events.
Large-scale events, such as the election of a president, earthquakes, or tsunamis, are generally
characterized by a considerable number of SUMs and a wider temporal and geographic
coverage. In contrast, small-scale events, such as traffic jams, car accidents, fires, or local shows, are
characterized by a small number of associated SUMs and a limited geographic and temporal coverage
[21]. As a result, small-scale event detection is not a trivial task because of the smaller number
of SUMs related to these kinds of events. Several research efforts studied event detection from
social networks data. Many of them deal with large-scale event detection [8][13][14][15][16]
and only a few of these works investigated small-scale events [17][18][19][20].
3. METHODOLOGY
A three-phase process, which includes data capture, data understanding, and presentation is used
in this work (see Figure 1).
Data Capture. This phase helps identify messages on social media networks about the city’s
activities, events, and concerns. This process is achieved by collecting massive amounts of
pertinent data from social media networks using primarily APIs, provided by these sources, or
through crawling. The most popular social media platforms are Twitter, Facebook, Google+,
LinkedIn, and YouTube. Some preprocessing steps may be performed to prepare a dataset for
the data understanding phase. They typically include modeling and data representation,
syntactic and semantic operations, linking data originating from different sources, and feature
extraction.
Data understanding. In this phase, the meaning of the data collected in the previous phase is
assessed and metrics useful for decision-making are generated. As collected data might
originate from multiple sources, it might contain a significant amount of noise that needs to be
cleaned before conducting meaningful analyses. This cleaning function may use simple, rule-
based text classifiers or more sophisticated classifiers trained on labeled data. To assess the
meaning of cleaned data, various techniques such as text mining, natural language processing,
social network analysis, and statistical methods can be used. This phase can provide information
about user opinions concerning an event or a service. Its results significantly impact the
information and metrics in the presentation phase as well as the future decisions and actions a
smart city might take.
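As a small, self-contained example of such a metric (the counts are invented), the class counts produced by sentiment analysis can be converted into the percentage breakdown reported in the presentation phase:

```python
def sentiment_percentages(counts):
    """Convert raw sentiment class counts into percentages for reporting."""
    total = sum(counts.values())
    if total == 0:
        return {label: 0.0 for label in counts}
    return {label: round(100.0 * n / total, 1) for label, n in counts.items()}

# Hypothetical class counts for one city over one day.
daily = {"positive": 10, "negative": 60, "neutral": 30}
percentages = sentiment_percentages(daily)
# percentages == {'positive': 10.0, 'negative': 60.0, 'neutral': 30.0}
```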
Presentation. In this last phase, the results of the different analyses are summarized, evaluated, and
presented to users in an easy-to-understand format. The visual dashboard, which aggregates and
displays data from multiple sources, is a commonly used interface design.
Figure 1. Overview of the proposed method
4. CASE SCENARIO
In this scenario, we implement a real-time application to get the latest Twitter feeds concerning
the #Dubai, #AbuDhabi, and #Sharjah hashtags and process them by subjecting them to sentiment
analysis.
Data streams (tweets) from Twitter have been recognized as a valuable data source for many
smart cities in areas such as law enforcement, tourism, and politics (e.g., US presidential
election). The Twitter Streaming API permits extracting datasets that are then used to perform
sentiment analysis.
The following workflow describes the different steps we used for the analysis of the tweets:
1. Authenticate and connect to Twitter using Twitter API
2. Use the search REST API to gather tweets using keywords, hashtags, and Twitter
accounts specified in an input file.
3. Store collected tweets in a CSV file
4. Preprocess the gathered tweets by removing duplicates
5. Use TextBlob to obtain sentiment for the unique tweets. TextBlob is a Python library for
processing textual data. The result of a sentiment analysis task is the percentage of
positive, negative, and neutral opinions.
6. Compute the counts of total, positive, negative, and neutral tweets over a number of days.
7. Visualize results
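The steps above can be sketched end to end as follows. The sketch is self-contained for illustration only: it operates on an in-memory list of tweets instead of the Twitter API and CSV file, and a tiny word lexicon stands in for TextBlob's polarity scorer, which would supply the sentiment score in the actual pipeline:

```python
from collections import Counter

# Toy lexicons standing in for TextBlob's polarity scorer (illustrative only).
POSITIVE = {"smooth", "great", "good", "clear"}
NEGATIVE = {"jam", "terrible", "bad", "stuck"}

def sentiment(text: str) -> str:
    """Classify a tweet as positive, negative, or neutral by lexicon word counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def analyze(tweets):
    """Steps 4-6: remove duplicates, score each unique tweet, count per class."""
    unique = list(dict.fromkeys(tweets))  # order-preserving de-duplication
    counts = Counter(sentiment(t) for t in unique)
    counts["total"] = len(unique)
    return counts

tweets = [
    "Traffic is terrible on E11, stuck for an hour #Dubai",
    "Traffic is terrible on E11, stuck for an hour #Dubai",  # duplicate (step 4)
    "Roads are clear and smooth today #AbuDhabi",
    "Heading to the mall #Sharjah",
]
results = analyze(tweets)  # 1 negative, 1 positive, 1 neutral, 3 total
```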
Figure 2. A workflow for the sentiment analysis of tweets
Figure 3 depicts the results obtained by analyzing the opinions of the users regarding the traffic in
the main UAE cities of Dubai, Abu Dhabi, and Sharjah over the period from March 7, 2018, to
March 12, 2018.
Figure 3. Opinions of the users regarding the traffic in the UAE main cities
The results show that a significant proportion of users have a negative opinion on the traffic in the
three large UAE cities with more negative views on the traffic in Dubai. The percentage of users
with a positive opinion is relatively low compared with negative opinions in the three cities.
However, for Sharjah, that percentage is slightly higher than the percentage of neutral opinions,
which is not the case for Dubai and Abu Dhabi. These results indicate that users are suffering
from the traffic conditions in the three cities and that municipal services should work hard to find
solutions to the traffic problem, which is one of the challenges that most megacities are facing.
5. CONCLUSION
Modern cities increasingly rely on Internet of Things (IoT) devices and
sensors to sense various parameters such as temperature, humidity, water leaks, sunlight, and air
pressure. These devices and sensors generate massive volumes of data. Also, social networks are
becoming a new source of real-time information in smart cities. Social network users are acting as
social sensors. In this work, we described how social media analytics could help analysing urban
data streams collected from popular social media sources, such as Twitter and Facebook, to detect
events taking place in a smart city and identify the concerns of citizens regarding some events or
issues. In a case scenario, we analyze the sentiments of users concerning the traffic in the three largest
cities in the UAE. The results show that users are so frustrated with the traffic in the cities of
Dubai and Abu Dhabi that their municipalities need to work hard to alleviate this issue.
REFERENCES
[1] I. A. T. Hashem, V. Chang, N. B. Anuar, K. Adewole, I. Yaqoob, A. Gani, E. Ahmed, and H.
Chiroma, (2016) “The role of big data in smart city,” International Journal of Information
Management, vol. 36, no. 5, pp. 748–758.
[2] Tim L. M. van Kasteren, Birte Ulrich, Vignesh Srinivasan, Maria E. Niessen, (2014) “Analyzing
Tweets to Aid Situational Awareness,” Advances in Information Retrieval, Vol. 8416, Lecture Notes
in Computer Science, pp 700-705.
[3] F. Atefeh and W. Khreich, (2015) “A survey of techniques for event detection in Twitter,” Comput.
Intell., vol. 31, no. 1, pp. 132–164.
[4] P. Ruchi and K. Kamalakar, (2013) “ET: Events from tweets,” in Proc. 22nd Int. Conf. World Wide
Web Comput., Rio de Janeiro, Brazil, pp. 613–620.
[5] A. Mislove, M. Marcon, K. P. Gummadi, P. Druschel, and B. Bhattacharjee, (2007) “Measurement
and analysis of online social networks,” in Proc. 7th ACM SIGCOMM Conf. Internet Meas., San
Diego, CA, USA, pp. 29–42.
[6] G. Anastasi et al., (2013) “Urban and social sensing for sustainable mobility in smart cities,” in Proc.
IFIP/IEEE Int. Conf. Sustainable Internet ICT Sustainability, Palermo, Italy, pp. 1–4.
[7] A. Rosi et al., (2011) “Social sensors and pervasive services: Approaches and perspectives,” in Proc.
IEEE Int. Conf. PERCOM Workshops, Seattle, WA, USA, pp. 525–530.
[8] T. Sakaki, M. Okazaki, and Y. Matsuo, (2013) “Tweet analysis for real-time event detection and
earthquake reporting system development,” IEEE Trans. Knowl. Data Eng., vol. 25, no. 4, pp. 919–
931.
[9] J. Allan, (2002) “Topic Detection and Tracking: Event-Based Information Organization”. Norwell,
MA, USA: Kluwer.
[10] J. Hurlock and M. L. Wilson, (2011) “Searching Twitter: Separating the tweet from the chaff,” in
Proc. 5th AAAI ICWSM, Barcelona, Spain, pp. 161–168.
[11] S. Weiss, N. Indurkhya, T. Zhang, and F. Damerau, (2004) “Text Mining: Predictive Methods for
Analyzing Unstructured Information,” Berlin, Germany: Springer-Verlag.
[12] A. Hotho, A. Nürnberger, and G. Paaß, (2005) “A brief survey of text mining,” LDV Forum-GLDV J.
Comput. Linguistics Lang. Technol., vol. 20, no. 1, pp. 19–62.
[13] M. Krstajic, C. Rohrdantz, M. Hund, and A. Weiler, (2012) “Getting there first: Real-time detection
of real-world incidents on Twitter,” in Proc. 2nd IEEE Workshop Interactive Vis. Text Anal.—Task-
Driven Anal. Soc. Media, IEEE VisWeek, Seattle, WA, USA.
[14] C. Chew and G. Eysenbach, (2010) “Pandemics in the age of Twitter: Content analysis of tweets
during the 2009 H1N1 outbreak,” PLoS ONE, vol. 5, no. 11, pp. 1–13.
[15] B. De Longueville, R. S. Smith, and G. Luraschi, (2009) “OMG, from here, I can see the flames!: A
use case of mining location based social networks to acquire spatiotemporal data on forest fires,” in
Proc. Int. Work. LBSN, Seattle, WA, USA, pp. 73–80.
[16] J. Yin, A. Lampert, M. Cameron, B. Robinson, and R. Power, (2012) “Using social media to enhance
emergency situation awareness,” IEEE Intell. Syst., vol. 27, no. 6, pp. 52–59.
[17] T. Sakaki, Y. Matsuo, T. Yanagihara, N. P. Chandrasiri, and K. Nawa, (2012) “Real-time event
extraction for driving information from social sensors,” in Proc. IEEE Int. Conf. CYBER, Bangkok,
Thailand, pp. 221–226.
[18] P. Agarwal, R. Vaithiyanathan, S. Sharma, and G. Shroff, (2012) “Catching the long-tail: Extracting
local news events from Twitter,” in Proc. 6th AAAI ICWSM, Dublin, Ireland, Jun. pp. 379–382.
[19] F. Abel, C. Hauff, G.-J. Houben, R. Stronkman, and K. Tao, (2012) “Twitcident: fighting fire with
information from social web streams,” in Proc. ACM 21st Int. Conf. Comp. WWW, Lyon, France, pp.
305–308.
[20] R. Li, K. H. Lei, R. Khadiwala, and K. C.-C. Chang, (2012) “TEDAS: A Twitter- based event
detection and analysis system,” in Proc. 28th IEEE ICDE, Washington, DC, USA, pp. 1273–1276.
[21] A. Schulz, P. Ristoski, and H. Paulheim, (2013) “I see a car crash: Real-time detection of small scale
incidents in microblogs,” in The Semantic Web: ESWC 2013 Satellite Events, vol. 7955. Berlin,
Germany: Springer-Verlag, pp. 22–33.
[22] TextBlob. TextBlob: Simplified Text Processing. http://textblob.readthedocs.io/en/dev/