NATIONAL AND KAPODISTRIAN UNIVERSITY OF ATHENS
FACULTY OF SCIENCES
DEPARTMENT OF INFORMATICS AND TELECOMMUNICATIONS
BACHELOR THESIS
Developing a sensing framework for mobile devices
Alina J. Gabaraeva
Iliona D. Iliadhi
Supervisors: Alonistioti Athanasia, Assistant Professor
Dimitrios Soukaras, Scientific Associate
ATHENS
March 2015
BACHELOR THESIS
Developing a sensing framework for mobile devices
Iliona D. Iliadhi
Registration number: 1115200900180
Alina J. Gabaraeva
Registration number: 1115200800222
SUPERVISORS: Alonistioti Athanasia, Assistant Professor
Dimitrios Soukaras, Scientific Associate
ABSTRACT
IoT stands for the Internet of Things. Much of the Internet of Things is mobile, and compact sensors play a key role in delivering status information for the connected devices. For this reason, the Internet of Things can be thought of as an Internet of sensors and actuators.
This paper describes the development of a sensing and data processing framework for
accessing sensor values on Android mobile devices and making them available to other
applications. Our goal is to bring the sensing capabilities of smartphone devices to IoT
architectures.
By implementing this mobile sensing framework, we intend to facilitate the interface between the IoT system, the user application and the device hardware drivers. This work will help developers focus on writing minimal pieces of sensor-specific code, enabling an ecosystem of reusable sensor drivers. We describe the challenges and requirements for such a framework, outline an architecture and report on our experience with an implementation running on top of the Android platform.
SUBJECT AREA: Internet of Things
KEYWORDS: phone sensing, framework, sensors, Android, mobile computing
To my loved ones.
-Alina
To my family that supported me in this journey.
-Iliona
ACKNOWLEDGMENTS
For the completion of this thesis, we would like to thank our supervisors, Ms. Nancy Alonistioti and Mr. Dimitrios Soukaras, for their precious help and guidance. Their constructive comments and their assistance in the design of the framework were very valuable for our thesis.
TABLE OF CONTENTS
PREFACE......................................................................................................................................13
1. INTRODUCTION..................................................................................................................14
1.1 Motivation ..........................................................................................................................................................................14
1.2 A brief summary of the subject ......................................................................................................................................14
1.3 Paper Roadmap..................................................................................................................................................................15
2. BACKGROUND .....................................................................................................................16
2.1 Overview of Internet of Things......................................................................................................................................16
2.1.1 IoT definition..............................................................................................................................................................16
2.1.2 Evolution of the Internet of Things ........................................................................................................................16
2.1.3 Use case scenarios and the future .........................................................................................................................18
2.2 IoT as an internet of sensors ...........................................................................................................................................19
2.3 Sensing with mobile phones ...........................................................................................................................................22
2.3.1 Smartphones as a sensing platform.......................................................................................................................22
2.3.2 What is a sensing framework? ................................................................................................................................24
2.3.3 IoT applications using mobile phone based sensing...........................................................................................25
3. ANDROID AS A MOBILE OPERATING SYSTEM...............................................................26
3.1 Android ................................................................................................................................................................................26
3.1.1 Background and history............................................................................................................................................26
3.1.2 Architecture................................................................................................................................................................26
3.1.3 Android SDK................................................................................................................................................................27
3.1.4 Application features..................................................................................................................................................28
3.2 Android and the Internet of Things ...............................................................................................................................29
3.3 Choosing Android as the Implementation Platform ..................................................................................................29
4. ANDROID SENSING FRAMEWORK...................................................................................31
4.1 Framework Description....................................................................................................................................................31
4.1.1 Core functionality - Overview..................................................................................................................................31
4.1.2 Implementation environment.................................................................................................................................32
4.2 SENSOROID Framework Fundamentals........................................................................................................................32
4.2.1 Framework components ..........................................................................................................................................32
4.2.2 Intents..........................................................................................................................................................................35
4.2.3 Context........................................................................................................................................................................36
4.2.4 Listeners ......................................................................................................................................................................36
4.2.5 Android Manifest.......................................................................................................................................................37
4.2.6 Sensors and Location................................................................................................................................................37
4.3 Development Requirements and Constraints .............................................................................................................43
4.4 Framework Architectural Design ...................................................................................................................................44
4.5 Framework Development Analysis................................................................................................................................44
4.5.1 Framework Interface ................................................................................................................................................45
4.5.2 Sensor Interface Implementation...........................................................................................................................45
4.5.3 Sensor Manager via Services...................................................................................................................................45
4.5.4 Gathering Sensor Data..............................................................................................................................................46
4.6 Data Flow Modulation......................................................................................................................................................48
4.7 Test and Extensibility........................................................................................................................................................51
4.7.1 Testing on the real device versus the simulator ..................................................................................................51
4.7.2 SENSOROID extensibility.........................................................................................................................................54
5. CONCLUSIONS......................................................................................................................55
ABBREVIATIONS AND ACRONYMS.........................................................................................56
APPENDIX A................................................................................................................................57
REFERENCES...............................................................................................................................66
TABLE OF DIAGRAMS
Diagram 1: Custom Data Listener Interface activity sequence diagram ...........................47
Diagram 2: SENSOROID class diagram ................................................................................48
Diagram 3: Start Sensor method activity sequence diagram .............................................49
Diagram 4: Stop Sensor method activity sequence diagram ..............................................50
TABLE OF FIGURES
Figure 1: Internet-connected devices and the future evolution (Source: Cisco 2011) ....16
Figure 2: How a sensor works (from left to right), Source: Carré & Strauss.....................20
Figure 3 : Range of Sensors in IoT (Source: Reproduced from “Practical Electronics for
Inventors”, Paul Scherz and Simon Monk) ................................................................................21
Figure 4: Android Architecture (Source: blog.ameykelkar.com) .........................................27
Figure 5: Global smartphone operating system market .......................................................30
Figure 6: SENSOROID framework structure .........................................................................31
Figure 7: Android Activity Lifecycle (Source: Android Developers) ....................................33
Figure 8:Android Service lifecycle (Source: Android Developers)......................................34
Figure 9: SENSOROID implemented sensors ..........................................................
Figure 10: Accelerometer axes (Source: Android Developers)...........................................39
Figure 11: Android location components (Source: [12]) .......................................................42
Figure 12: SENSOROID architecture of all levels .................................................................44
Figure 13: Sensor Manager interaction with sensor hardware ...........................................46
Figure 14: SensorSimulator overview (Source: OpenIntents) .............................................51
Figure 15: real device testing UI ..............................................................................................52
Figure 16: Main Activity Class Diagram.................................................................53
Figure 17: Phone Tester Application UI...................................................................53
PREFACE
This text was written in Athens in 2015 to support our graduation thesis. The idea for the subject discussed in this paper arose from the need to implement a mobile sensing framework for use within the context of the Internet of Things. It was therefore a challenge for us to research this innovative technological concept, especially given that there are few research resources and supporting materials about it.
1. INTRODUCTION
1.1 Motivation
Mobile devices can create, share, and sync everything we want, regardless of distance. Smartphones are fast becoming a ubiquitous computing platform. Statistics show that the total number of mobile phones shipped worldwide in the first quarter of 2014 was over 448.6 million devices [IDE]. The worldwide smartphone market grew 27.2% year over year in the second quarter of 2014. By 2017, 87% of the worldwide smart, connected device market will be tablets and smartphones, with PCs (both desktop and laptop) making up 13% of the market [IDE].
These new mobile devices are programmable and come with a growing set of pre-
installed powerful embedded sensors with multiple abilities for detecting GPS positions,
directional accelerations, rotational vectors, device proximities, temperatures, ambient
light conditions, etc. These sensors provide context-aware solutions and enable the creation of a new level of sensor-based applications in health, entertainment, access control, security, energy efficiency, home monitoring and home care.
Smartphones, and modern mobile devices in general, run on various mobile operating systems, some of them even on two. The most common are Android, iOS for Apple smartphones and tablets, Windows Phone, etc. With such a variety of operating systems comes the need to develop a generic framework that can retrieve the values of various sensor types regardless of the OS the smartphone is running.
In this paper, we present SENSOROID, a new framework that provides context-rich data streams collected from Android smartphones. Our architecture supports a service model; built on the Android platform, it can be used by Java developers for integrating contextual data.
1.2 A brief summary of the subject
We live in a connected world. Nearly two billion people connect to the Internet, share information and communicate over blogs, wikis, social networks and a host of other media. By 2020 there will be fifty billion connected units, a vision known globally as the "Internet of Things". It involves people and objects being able to connect to the Internet anytime and anywhere and to share information about their behavior. To summarize the benefits of this grand vision: it turns all these things into a platform where data can be communicated, collated, analyzed and converted into useful information, in a secure way, with the aim of making our lives better.
It is already here: miniaturization and other technological advances already make it possible to instrument and connect virtually any object. With the IoT, small sensors are being integrated into real-world objects, acting as instruments that offer information about almost everything that can be measured. So the main factor that makes machine-to-machine (M2M) communication possible is sensing [1].
In this paper, we discuss a promising new research area called mobile sensing. It promotes completely decentralized sensing based on smartphone capabilities only. Recent evolutions in smartphone platforms, such as Android and iOS, are broadening the traditional concept of the mobile device to provide not only computing resources, but also sensing capabilities in the form of built-in sensors. These new features make mobile devices powerful and complete sensing platforms that can continuously watch and monitor the behavior of users who move and act in the physical world carrying their mobile devices with them.
On the other hand, mobile sensing applications are not yet widely developed, mainly because there are still several open technical issues. Different devices and platforms such as Android and iOS use very different interfaces to their sensors; privacy is another issue because of the amount and sensitivity of sensed data; and monitoring tasks require intensive use of hardware sensors. In other words, they can reduce battery lifetime and should be carefully managed.
Therefore, we propose SENSOROID, a generic sensor reading framework that retrieves
data from actual sensor drivers and provides a generic outbound interface. This Android framework for mobile sensing aims to offer IoT app developers a set of
attractive facilities and functions to quickly and easily design their own mobile sensing
services.
Up until now, it has been a challenging task for software developers (especially scientists and experimenters) to implement specialized sensor applications. In our current implementation, available for Android, most of the typical built-in Android sensors are already supported. The sensing framework implementation is presented in detail below. The description covers the framework's architectural design and the framework interfaces that offer sensor access and data gathering for further elaboration by developers. We also designed the data flow model for a better understanding of the classes and the communication among them. This sensing framework can easily accommodate improvements and other extensions in the future, fulfilling our goal of keeping all of its levels abstract.
1.3 Paper Roadmap
The text below is structured as follows: in Chapter 2, the theoretical background is presented, along with explanations of basic concepts related to our research area; in Chapter 3, the basic characteristics and architecture of Android are presented, supporting our choice of mobile operating system platform; in Chapter 4, we cover the basic characteristics of our sensing framework; and in Chapter 5 we outline some conclusions.
2. BACKGROUND
2.1 Overview of Internet of Things
2.1.1 IoT definition
When it comes to defining the Internet of Things as a new concept in technology, there are a lot of different opinions, but the basic idea is this: the Internet of Things links objects of the real world with the virtual world, enabling connection between "things" anytime, anyplace, with anything and anyone, ideally using any network and any service.
In its most technical sense, the term refers to an infrastructure in which billions of sensors embedded in common, everyday devices ("things" as such, or things linked to other objects or individuals) are designed to record, process, store and transfer data and, as they are associated with unique identifiers, to interact with other devices or systems using networking capabilities. Through their connections, objects obtain intelligence by cooperating and sharing information about themselves to reach common goals, such as making a smarter world: a world where the real, the digital and the virtual converge to create smart environments that make energy, transport, cities and many other areas more efficient [2].
2.1.2 Evolution of the Internet of Things
The number of Internet-connected devices surpassed the number of human beings on the planet in 2011, and by 2020 Internet-connected devices are expected to number between 26 billion and 50 billion. For every Internet-connected PC or handset there will be 5–10 other types of devices sold with native Internet connectivity. So today's well-known Internet of PCs will soon move toward the Internet of Things.
Figure 1: Internet-connected devices and the future evolution (Source: Cisco 2011)
The IoT idea is not new. The concept of the "Internet of Things" came to light in 2005, when the International Telecommunication Union published the first report on the subject. However, the IoT's roots can be traced back to the Massachusetts Institute of Technology (MIT), to work at the Auto-ID Center. Founded in 1999, this group worked in the field of networked radio frequency identification (RFID) and emerging sensing technologies. The labs consisted of seven research universities located across four continents, chosen by the Auto-ID Center to design the architecture for the IoT. The concept only recently became relevant to the practical world, mainly because of the progress made in hardware development in the last decade. The decline in size, cost and energy consumption (hardware dimensions that are closely linked to each other) now allows the manufacturing of extremely small and inexpensive low-end computers [3].
The IoT application space is very diverse, and IoT applications serve different users under three key categories: consumers, communities and enterprises. There exist
numerous opportunities for IoT applications to leverage the changing dynamics of
societal trends (health and wellness, transport and mobility, security and safety, energy
and environment, communication and e-society) and market trends (consumer
electronics, automotive electronics, medical applications, communication, etc.) [4].
Due to a lack of standardization and interoperability, the Internet of Things is sometimes
seen as an “Intranet of Things” in which every manufacturer has defined its own set of
interfaces and data format. Data is then hosted in walled environments, which
effectively prevents users from transferring (or even combining) their data from one
device to another. Besides some well-known embryonic applications (Arduino,
Nabaztag, Pachube, Touchatag, etc.), today objects can only exchange information
within "intranets of things". These objects cannot yet address, any Internet of Things,
which by definition should be open, uncertain and complex. One of the main challenges
of the Internet of Things is therefore to transform connected objects into real actors of
the Internet.
Yet, smartphones and tablets have become the natural gateways of data collected
through many IoT devices to the internet. As a result, manufacturers have progressively
developed platforms that aim to host the data collected through such different devices,
in order to centralize and simplify their management. Many large technology companies
are involved in governing the IoT ecosystem through various initiatives and efforts
around standardization. There are over 14 industry bodies working on developing IoT
standards, some of the leading ones include Thread (led by Google), AllSeen Alliance
(led by Qualcomm) and Industrial Internet Consortium (IIC, led by GE) [4].
Over the next 10 to 15 years, the Internet of Things is likely to develop fast and shape a newer "information society" and "knowledge economy", but the direction and pace with which developments will occur are difficult to forecast. The maturity of the IoT is a long way ahead, and the road there will be characterized by rapid innovation, disruption and continuous evolution.
2.1.3 Use case scenarios and the future
Internet of Things has a very wide applicability in many areas. Today, several of IoT
scenarios are very end-user oriented and we will analyze three specific IoT
developments (Wearable Computing, Quantified Self and domotics) which are directly
interfaced to the user and correspond to devices and services that are actually in use.
• Wearable Computing
Wearable computing refers to everyday objects and clothes, such as watches and glasses, in which sensors are included to extend their functionality. Wearable things are likely to be adopted quickly, as they extend the usefulness of everyday objects which are familiar to the individual, all the more so as they can hardly be differentiated from their unconnected look-alikes. They may embed cameras, microphones and sensors that can record and transfer data to the device manufacturer. Furthermore, the availability of an API for wearable devices (e.g. Android Wear) also supports the creation of applications by third parties, who can thus get access to the data collected by those things [5].
• Quantified Self
Quantified Self things are designed to be regularly carried by individuals who want to
record information about their own habits and lifestyles. For example, an individual may
want to wear a sleep tracker every night to obtain an extensive view of sleep patterns.
Other devices focus on tracking movements, such as activity counters which
continuously measure and report quantitative indicators related to the individual’s
physical activities, like calories burned or distances walked, among others.
Some objects further measure weight, pulse and other health indicators. By observing
trends and changes in behavior over time, the collected data can be analyzed to infer
qualitative health-related information, including assessments on the quality and effects
of the physical activity based on predefined thresholds and the likely presence of
disease symptoms, to a certain extent.
Quantified Self sensors are often required to be worn in specific conditions to extract
relevant information. For example, an accelerometer placed on the belt of a data
subject, with the appropriate algorithms, could measure abdominal movements (raw data),
extract information about the breathing rhythm (aggregated data and extracted
information) and display the level of stress of the data subject (displayable data). On
some devices, only this latter information is reported to the user, but the device
manufacturer or the service provider may have access to much more data that can be
analyzed at a later stage.
Quantified Self is challenging with regard to the types of data collected, which are health-related and hence potentially sensitive, as well as to the extensive collection of such data.
Since it focuses on motivating users to remain healthy, it has many connections with the e-health ecosystem. Yet recent investigations have challenged the real accuracy of the measurements and of the inferences made from them [5].
• Home automation ("domotics")
Today, IoT devices can also be placed in offices or homes such as “connected” light
bulbs, thermostats, smoke alarms, weather stations, washing machines, or ovens that
can be controlled remotely over the internet. For instance, things containing motion
sensors can detect and record when a user is at home, what his/her patterns of
movement are, and perhaps trigger specific pre-identified actions (e.g. switching on a
light or altering the room temperature). Most home automation devices are constantly
connected and may transmit data back to the manufacturer [5].
Based on its research, Bosch anticipates that the majority of IoT-connected devices in 2022 will be concentrated mainly in four industries: intelligent buildings, automotive, healthcare, and utilities. The IoT has a huge market potential, but also many challenges for the future.
Challenges that need to be addressed include how to exchange data between devices in a secure way, how to store and process huge amounts of data, and how to protect user privacy. This requires enabling database technologies, because large amounts of big data will be produced; this is of course related to the scale of cloud computing. The formula for the success of the IoT includes the standardization of communication between IoT middleware solutions and the IoT market in order to create large and successful ecosystems.
2.2 IoT as an internet of sensors
While we have discussed the evolving challenges of the Internet of Things concerning wireless and cloud technology, data storage and privacy, the IoT would not be possible without sensors. They are necessary to turn billions of objects into data-generating "things" that can report on their status and, in some cases, interact with their environment. In this context, we can say that sensors play a key role in the technological advances in this field.
• Sensors, a key role in IoT
Sensors are devices capable of detecting and measuring changes in position, temperature, light, etc., and turning them into electrical variables. Sensors are a bridge between the physical world and the Internet. They are literally the equivalent of the human senses: sight, hearing, touch, smell and taste. We need them to get physical data from "things"; once that data is turned into an electrical equivalent, it is easy to input into a computer for manipulation, analysis and display. In this way we can process data to build smarter solutions and improve the quality of life. Sensors are used almost everywhere in electronic products nowadays, and we can find them in a wide variety of applications, such as smart mobile devices, automotive systems, industrial control, healthcare, oil exploration and climate monitoring.
There are many types of sensors: chemical, magnetic, mechanical, position, pressure, temperature, CCD and CMOS image sensors, motion sensors, RFID, etc. One form of sensor technology is radio frequency identification (RFID), a method of identifying distinct items using radio waves. RFID is based on tags that contain microscopic chips used to store information about the item to which the tag is attached. Each year, hundreds of millions of sensors are manufactured. The application of nanotechnology to sensors should allow improvements in functionality leading to much decreased size, enabling the integration of 'nanosensors' into many other devices [6]. Advances in technologies such as wireless sensor networks (WSNs) and sensor fusion are leading to a tremendous expansion in the delivery of context-aware services customized for any given situation. To realize the Internet of Things vision of tiny wireless sensors all around us, sensor technology should also solve the problem of energy consumption by making sensors self-sufficient rather than the power-hungry devices of today.
Figure 2: How a sensor works (from left to right), Source: Carré & Strauss
Figure 3: Range of sensors in IoT (Source: Reproduced from "Practical Electronics for Inventors", Paul Scherz and Simon Monk)
2.3 Sensing with mobile phones
Mobile phone sensing is an emerging area of interest for researchers, as smartphones are becoming not only the key computing and communication device, but also a rich set of embedded sensors which collectively enable new applications, with phones acting as the natural gateways between the data collected by many IoT devices and the Internet.
2.3.1 Smartphones as a sensing platform
Phone manufacturers never intended their devices to act as general-purpose sensing devices; sensors were only considered as tools to facilitate interaction with the phone. However, the mobile industry has started to change direction. In the near future we expect the release of new hardware platforms that facilitate background sensing, and of new OS frameworks that incorporate general-purpose sensing middleware. The mobile phone is well on its way to becoming a personal sensing platform in addition to a communication device. Sensors have become more prevalent in mobile devices in recent years, making the mobile phone a sensor gateway for the individual. Phone sensing is about sensing human activities. Today's top-end mobile phones come with a number of embedded specialized sensors, including an ambient light sensor, accelerometer, digital compass, gyroscope, GPS and proximity sensor, as well as general-purpose sensors like the microphone and camera. Adding sensing capabilities to smartphones provides the opportunity to track dynamic information in social networks, green applications, global environmental monitoring, personal and community healthcare, sensor-augmented gaming, virtual reality and smart transportation systems. More and more organizations and people are discovering how mobile phones can be used for social impact, including how to use mobile technology for environmental protection and sensing, and how to leverage real-time information to make our movements and actions more environmentally friendly. So what better tool than the mobile phone to really launch the Internet of Things?
As mentioned, mobile devices are already equipped with a wide range of sensors, and a mobile device is by definition connected to a network. This is one of the key drivers turning the mobile phone into a sensing platform. Most of the smartphones on the market are open and programmable by third-party developers, and offer software development kits (SDKs), APIs, and software tools.
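For example, on Android the built-in sensors can be enumerated in a few lines using the standard SensorManager API. The following minimal sketch is ours for illustration (the class name SensorListActivity is hypothetical):

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;
import java.util.List;

// Hypothetical activity that enumerates the sensors available on the device.
public class SensorListActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SensorManager sensorManager =
                (SensorManager) getSystemService(Context.SENSOR_SERVICE);
        // TYPE_ALL asks for every sensor the hardware exposes.
        List<Sensor> sensors = sensorManager.getSensorList(Sensor.TYPE_ALL);
        for (Sensor sensor : sensors) {
            Log.d("SensorList", sensor.getName() + " (" + sensor.getVendor() + ")");
        }
    }
}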
Therefore, it’s easy to access sensor data and to leverage existing software to develop
new sensing applications for the general purpose of IoT. These applications rely on
advanced sensor information processes, which mainly involve raw data acquisition,
feature extraction, data interpretation and transmission. The acquired sensor data is typically sent over a wireless communication channel (e.g., via Wi-Fi or a cellular network) after locally performing a set of stages to select relevant features, filter redundant information and control data transmission behavior through the deployment and enforcement of low-level decision policies. At each stage, different algorithmic solutions have been envisaged to perform learning and classification tasks, and their main requirements depend on the application type and its impact on the CPU and battery. A common architectural model for phone sensing consists of three main components: sense, learn, and share [7].
• Sense
Individual mobile phones collect raw sensor data from sensors embedded in the phone.
• Learn
The raw sensor data from phones is worthless without interpretation. Information is extracted from the collected data with a variety of data mining and statistical tools. These operations can run either directly on the phone, in the mobile cloud, or with some partitioning between the phone and the cloud. As such, the mobile phone is a powerful unit that can continuously monitor a user's ambient context in real time.
• Share
Mobile phones are not limited to simply collecting sensor data. A number of phone
sensing systems connect with existing web applications to either enrich existing
applications or make the data more widely accessible. For example, a personal sensing
application will only inform the user, whereas a group or community sensing application
may share an aggregate version of information with the broader population and
obfuscate the identity of the users. Other considerations are how best to visualize sensor data for consumption by individuals, groups, and communities. Data sharing helps to personalize sensing systems based on the individual user and on groups of people with similar behavior [7].
Cell phones have become an indispensable tool not only for today's highly mobile workforce, but also for people in general to transfer and exchange diverse mobile data. Thus, there are open challenges regarding mobile sensing systems, including:
• The sensing of people and their environment, and context awareness
• The energy-efficient use of mobile device resources
• The meaning and interpretation of mobile sensor data
• Interactions with users, largely to provide feedback and information to them.
Once these technical barriers are overcome, this new field will advance rapidly, becoming a disruptive technology for a wide range of domains.
If the Internet of Things were divided architecturally into layers, what we have discussed so far is the sensing layer, realized through smartphones. Data provided by the sensing layer is processed in the management layer. This layer is integral to the Internet of Things
architecture and industry chain, integrating management, control, and operations on
terminals and assets, including mobile assets. The management platform comprises the
following software sets: integrated frameworks, Internet of Things middleware, industry
suites, and industry application solutions [8].
2.3.2 What is a sensing framework?
Internet of Things frameworks can help support the interaction between "things" and allow for more complex structures like distributed computing and the development of distributed applications. Currently, Internet of Things frameworks seem to focus on real-time data logging solutions, offering some basis to work with many "things" and have them interact. Software development then deals with the specific cooperation of software with the hardware used in the Internet of Things. When we talk about the interaction between "things", we mean sensing the physical world. There is no standardized method of obtaining data from sensors or of distributing this data to the applications that use it. A sensing framework offers a standardized, easy-to-use, and efficient interface to sensor software developers. Sensing frameworks let them write sensor modules focused on producing useful interpretations of the sensor data, without considering how the produced data will be retrieved from the hardware.
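As an illustration, such a framework might hide each sensor module behind a small uniform interface. The following Java sketch is hypothetical (the names SensorModule and DataCallback are ours, not part of any standard API):

// Hypothetical uniform interface that a sensing framework could expose
// to sensor-module authors; the names are illustrative only.
public interface SensorModule {

    // Callback through which a module delivers interpreted readings.
    interface DataCallback {
        void onData(String sensorName, float[] values, long timestampNanos);
    }

    String getName();                  // human-readable sensor name
    void start(DataCallback callback); // begin producing interpreted readings
    void stop();                       // release the underlying hardware
}

Under this kind of contract, a module author implements only the interpretation logic, while the framework decides when to start and stop the module and where to route its data.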
• Mobile Sensing Framework
For mobile development, interacting with the physical world is still a challenge. This is due to the complexities of sensor data acquisition, context modeling, and data management. Contextual information extracted from the user's environment can be used to enable an app to adapt its runtime behavior and capabilities to better fit a user's changing situation and requirements. Also, several applications share the same needs regarding the collected sensory data. Mobile sensing frameworks are flexible platforms that ease the development of mobile sensing applications through the definition of a common set of facilities that mask all low-level technical details of reading and processing raw sensor data. They are used by many independent developers to create software that must work together, so the main goals of implementing them are that sensor modules be easy to write and easy to use, and that they provide a consistent interface to sensors across processors and platforms. At the same time, frameworks must be flexible enough for developers to exploit all of the properties of the physical sensors and external data sources that make up the sensor modules. Given this need for unified sensing platforms, there is related work on mobile sensing frameworks, such as the Funf Open Sensing Framework, the EmotionSense framework, MSF and Purple Robot. These projects all collect contextual data using phone sensors and provide several data storage options, including the file system, remote servers and cloud services such as Dropbox [9].
Implementing frameworks is useful for the IoT because, as noted above, new features make mobile devices powerful and complete sensing platforms that can continuously watch and monitor the behavior of users who move and act in the physical world carrying their mobile devices with them. Moreover, it is possible to process large sets of locally collected raw data on the mobile device and to distill meaningful views of the user's current activity, such as running, cycling, talking, and sitting. In brief, frameworks help turn mobile devices into IoT sensing nodes. Many mobile applications can exploit frameworks to make use of these brand-new mobile sensing capacities, spanning different areas, from healthcare to homecare, from safety to smart grids and environmental monitoring, and many more.
2.3.3 IoT applications using mobile phone based sensing
First, phones can now be programmed to support new disruptive sensing applications, such as sharing the user's real-time activity with friends on social networks like Facebook, keeping track of a person's carbon footprint, or monitoring a user's well-being. Second, smartphones are open and programmable. In addition to sensing,
phones come with computing and communication resources that offer a low barrier of
entry for third-party programmers (e.g., undergraduates with little phone programming
experience are developing and shipping applications). Third, importantly, each phone
vendor now offers an app store, allowing developers to deliver new applications to large
populations of users across the globe, which is transforming the deployment of new
applications, and allowing the collection and analysis of data far beyond the scale of
what was previously possible. Fourth, the mobile computing cloud enables developers
to offload mobile services to back-end servers, providing unprecedented scale and
additional resources for computing on collections of large-scale sensor data and
supporting advanced features such as persuasive user feedback based on the analysis
of big sensor data.
These key factors give IoT applications a high potential for development and deployment. Mobile sensing applications can be divided into three big categories:
• Individual activity sensing: fitness applications, behavioral suggestions.
• Group activity sensing: a group senses common activities to help achieve group goals, e.g. neighborhood safety, collective recycling efforts.
• Community sensing: large-scale sensing, where a large number of people have the same application installed, e.g. tracking the speed at which a disease spreads across a city, or congestion in the city [10].
Next, let's mention the most important domains and examples.
• Physical activity
These kinds of applications use sensors such as the accelerometer, gyroscope and compass for activities such as walking or running.
Examples: health and calorie tracking, presence sharing.
• Transportation mode
Applications related to transportation use sensors such as the accelerometer, GPS and Wi-Fi for location specification. Mobile sensing systems such as the MIT VTrack are being used to provide traffic information on a large scale for improving commute planning.
• Healthcare
The UbiFit Garden, a joint project between Intel and the University of Washington,
captures levels of physical activity and relates this information to personal health goals
when presenting feedback to the user.
• Environment
Conventional ways of measuring and reporting environmental pollution rely on
aggregate statistics that apply to a community or an entire city. The University of
California at Los Angeles (UCLA) PEIR project uses sensors in phones to build a
system that enables personalized environmental impact reports, which track how the
actions of individuals affect both their exposure and their contribution to problems such
as carbon emissions [10].
3. ANDROID AS A MOBILE OPERATING SYSTEM
3.1 Android
3.1.1 Background and history
Android is a Linux-based mobile operating system, initially developed by Android Inc., which was acquired by Google in 2005. Google, as well as other members of the Open Handset Alliance (OHA), collaborated on Android's design, development, and distribution. Currently, the Android Open Source Project (AOSP) governs the Android maintenance and development cycle [11].
To reiterate, the Android operating system is based on a modified Linux 2.6 kernel [12]. Compared to a standard Linux 2.6 environment, though, several drivers and libraries have been either modified or newly developed to allow Android to run as efficiently and as effectively as possible on mobile devices (such as smartphones or Internet tablets). Some of these libraries have their roots in open source projects. Due to licensing issues, the Android community decided to implement its own C library (Bionic) and to develop an Android-specific Java runtime engine (the Dalvik Virtual Machine, DVM). With Android, the focus has always been on optimizing the infrastructure for the limited resources available on mobile devices. To complement the operating environment, an Android-specific application framework was designed and implemented. Therefore, Android can best be described as a complete solution stack, incorporating the OS, middleware components, and applications.
3.1.2 Architecture
Figure 4 outlines the current (layered) Android architecture. The modified Linux kernel operates as the HAL and provides device driver, memory management, process management and networking functionality. The library layer is interfaced through Java (which deviates from the traditional Linux design). It is in this layer that the Android-specific libraries (Bionic) are located. The surface manager handles the user interface (UI) windows. The Android runtime layer holds the Dalvik Virtual Machine (DVM) and the core libraries (such as Java or IO). Most of the functionality available in Android is provided via the core libraries.
The application framework holds the API interface. In this layer, the activity manager
governs the application life cycle. The content providers enable applications to either
access data from other applications or to share their own data. The resource manager
provides access to non-code resources (such as graphics), while the notification
manager enables applications to display custom alerts. On top of the application
framework sit the built-in applications as well as the user applications. It has to be pointed out that a user application can replace a built-in application, and that each Android application runs in its own process space, within its own DVM instance.
Figure 4: Android Architecture (Source: Amey Kelkar)
3.1.3 Android SDK
The Android Software Development Kit (SDK) is a set of development tools used to develop applications for the Android platform. The Android SDK has a modular structure, which means that the major components of the SDK are collected into separate packages. This makes it easy to install only the components you need for your particular use case. The packages you install are determined by the version of the OS you are targeting, whether you use third-party services (like Google Maps or Analytics), and whether you plan to support specific hardware (like a particular chipset or a dual screen).
The modular structure has two important benefits. The first is that disk storage is not
wasted on downloading unnecessary components. This is important because each
platform requires at least 100MB of space, and this can grow rapidly when optional
packages are included. The other advantage is that managing dependencies within a
project is streamlined because it is possible to control exactly which software you are
working with, and install only the components you require [13].
It is important to understand the various components that are available. They are
organized into categories:
• Required libraries
• Debugger
• An emulator
• Relevant documentation for the Android application programming interfaces (APIs)
• Sample source code
• Tutorials for the Android OS
As the SDK is free and easy to install and use across multiple platforms, it is attractive for application developers to implement their own ideas, leading to a very rich Android application market [14].
3.1.4 Application features
As mentioned before, Android applications are developed in the Java programming
language. The Android Software Development Kit (SDK) along with any data and
resource files—into an APK, compiles the application source code: an Android package,
which is an archive file with an .apk suffix. These types of files are the ones that the
Android devices use to install the applications.
In most cases, every Android application runs in its own Linux process. Each process has its own virtual machine, so applications do not interfere with each other. The process is created for the application when some of its code needs to be run, and it remains running until it is no longer needed and the system needs to reclaim its memory for use by other applications.
An unusual and fundamental feature of Android is that an application process's lifetime is not directly controlled by the application itself. Instead, it is determined by the system through a combination of the parts of the application that the system knows are running, how important these things are to the user, and how much overall memory is available in the system. To determine which processes should be killed when memory is low, Android places each process into an "importance hierarchy" based on the components running in them and the state of those components [15].
Android applications are mainly made of different essential building blocks, called application components. Each component defines a way to access the application. Android defines four types of components:
1. Activities: Each activity corresponds to a screen of the application and implements the methods related to its layout.
2. Services: A service works in a similar way to an activity, but without a user interface. It is used to run long background operations.
3. Content providers: This component is used to access or even modify data from outside the application (e.g. to access and/or modify the contact information stored on the device).
4. Broadcast receivers: A broadcast receiver is a component that can be executed when a system-wide broadcast message is sent from any point of the system. It can be used to wake up the application on a specific event (e.g. a battery state change or time change) [16].
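For instance, a minimal started service, the component type later used by SENSOROID for background sensing, could look like the sketch below (SensingService is a hypothetical name, and the component must also be declared with a <service> element in AndroidManifest.xml):

import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

// Minimal started service: runs in the background without a user interface.
public class SensingService extends Service {
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // Long-running background work (e.g. sensor sampling) starts here.
        return START_STICKY; // ask the system to recreate the service if killed
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null; // this sketch does not support binding
    }
}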
3.2 Android and the Internet of Things
Android will significantly help original equipment manufacturers (OEMs) in rolling out IoT devices. Intelligent IoT devices that need to do more than report data (such as a personal health monitoring device, about the size of a small phone, that elderly users might carry and that collects and analyzes data from various body sensors), as well as powerful IoT devices such as home appliances, watches, and car dashboards, will benefit greatly from Android. And of course, one cannot avoid mentioning Google Glass when speaking of the IoT.
The key reasons why IoT devices will use Android:
• Android is a stable and free OS that has already been validated in the smartphone market as an OS capable of running on embedded systems.
• As of early 2013, the Google Play marketplace lists over 800,000 applications, which means there is a vibrant developer ecosystem building innovative solutions. If an IoT device uses Android as its base OS, it can automatically leverage these applications, and you can rest assured that developers will start building a slew of innovative apps on that platform.
• You can customize Android as much as you want: it is possible to add your own features right into the OS to make it just right for your environment. There simply isn't another embedded OS that combines this flexibility with so many third-party apps and such widespread adoption.
• Android already supports key communication protocols like NFC, Bluetooth Smart (low energy profile), Wi-Fi Direct and others, which are key for IoT devices.
• Android already supports a wide variety of form factors, and the APIs are tuned to give developers control over how to write apps for a wide variety of sizes (they factor in both screen sizes and pixel density).
• Android apps are written in Java. Java resources are probably much easier to find, and it is also easier to maintain Java code from a cost perspective.
However, being able to adapt Android to your specific device's needs is a big challenge for many OEMs. The Android SDK has excellent documentation, but if a 'custom ROM' or 'custom image' of Android needs to be created by recompiling the source, there is very little documentation on how to modify Android. Bottom line: while the promises of Android are many, adapting Android to a device is also very complicated [14].
3.3 Choosing Android as the Implementation Platform
The choice of operating system is critical when designing a framework that will be
deployed on multiple device types. Common mobile platforms include BlackBerry, Windows Phone, Apple iOS, and Google Android as the embedded OS.
According to a report from Strategy Analytics, Google’s Android continues to dominate
the smartphone operating system market. It is important to mention that Android has
sophisticated application marketplaces, which provide easy access to both software
distribution and maintenance channels.
We chose Android as the target platform for the SENSOROID sensing framework because it is open source, has extensive support for background processes, and includes several built-in constructs for inter-process communication (IPC) between Android applications.
Figure 5: Global smartphone operating system market
Additionally, Android supports multiple communications APIs that facilitate connecting to a wide variety of external sensors, as described in the previous subsection [16].
4. ANDROID SENSING FRAMEWORK
4.1 Framework Description
4.1.1 Core functionality - Overview
The objective of this work is to implement an easy-to-use sensing framework that developers and researchers can use to provide continuous sensing in Android applications.
The model we follow provides sensor data, on demand, to anyone who requires access to it [17]. The sensing-as-a-service model does not collect sensor data from all the available sensors at all times, but keeps track of the individual sensors, their accessibility, and their capabilities. It does not collect sensor data unless a developer makes a request [18].
Figure 6: SENSOROID framework structure
Thus, the framework architecture should be modular and easy to use at two levels:
application developers should be able to quickly create new applications based on raw
data and/or already-computed high-level inferences, freeing them from understanding
the specifics of the underlying communication with the built-in sensors on Android
devices; library developers should be able to easily plug in new components, such as
support for new sensors and activity classifiers [9].
In particular, SENSOROID consists of multiple system services and an Android sensor
event listener that collects sensor values using event handlers and forwards them
through custom listeners. The goal is a generic framework architecture
implementation.
4.1.2 Implementation environment
4.1.2.1 Eclipse IDE
Most people know Eclipse as an integrated development environment (IDE) for Java.
Even though Android Studio is now the official IDE for Android, we chose Eclipse as
the implementation environment for the SENSOROID framework. Android offers a
custom plugin for the Eclipse IDE, called the Android Development Tools (ADT). This
plugin provides a powerful, integrated environment in which to develop Android apps.
It extends the capabilities of Eclipse to let you quickly set up new Android projects,
build an application's UI, debug your application, and export signed (or unsigned)
application packages (APKs) for distribution. In our project we use the latest ADT
version, 23.0.4.
4.1.2.2 Application Program Interface and Test specifications
SENSOROID was developed on Mac OS Yosemite (version 10.10.2) and Windows 8
using the Android Framework API level 21, targeting device platform versions 8 and
newer.
Two smartphones were used to test the framework, running the KitKat (4.4.2) and
Jelly Bean (4.2.2) versions of the Android mobile operating system.
The Linux kernel versions varied between the devices tested, ranging from 3.4.0 to 3.4.5.
4.2 SENSOROID Framework Fundamentals
4.2.1 Framework components
4.2.1.1 Activities
An Activity is a type of Android component that is in charge of user interface (UI)
management and user interaction event handling. Almost all activities require
interaction with the user, and for that reason the activity takes care of creating the
window and laying out the UI components [19].
Typically, one activity in an application is specified as the "main" activity, which is
presented to the user when launching the application for the first time. Each activity can
then start another activity in order to perform different actions. Each time a new activity
starts, the previous activity is stopped, but the system preserves the activity in a stack
(the "back stack"). When a new activity starts, it is pushed onto the back stack and
takes user focus. The back stack abides by the basic "last in, first out" stack mechanism,
so, when the user is done with the current activity and presses the Back button, it is
popped from the stack (and destroyed) and the previous activity resumes.
When an activity is stopped because a new activity starts, it is notified of this change in
state through the activity's lifecycle callback methods.
The activity life cycle is managed by the Activity Manager, a service that runs inside the
Android Framework layer of the stack. It is responsible for creating, destroying, and
managing activities. For example, when the user starts an application for the first time,
the Activity Manager will create its activity and put it onto the screen [20].
To create a new activity, you simply derive a new class from the android.app.Activity
class. Seven lifecycle callbacks can be implemented (see Figure 7): onCreate(),
onStart(), onRestart(), onResume(), onPause(), onStop(), and onDestroy().
Figure 7: Android Activity Lifecycle (Source: Android Developers)
At first, when an activity starts, the methods onCreate(), onStart() and onResume() are
executed. onCreate() is called when the activity is created; all activities implement
this method in order to initialize the activity and its UI. If another activity comes into the
foreground, the running activity goes into onPause(). Paused activities still have high
priority in terms of getting memory and other resources, but if there is not enough
memory for a foreground activity, an activity kept in the background may be destroyed.
That means that any important information should be saved while running the onPause()
method.
Conversely, any information that needs to be reloaded or reconfigured should be handled
while running the onResume() method. In this state, the user can interact with the
application.
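To make the lifecycle concrete, the following minimal sketch (the class name and log messages are ours, for illustration) overrides the three callbacks discussed above:

    import android.app.Activity;
    import android.os.Bundle;
    import android.util.Log;

    // Minimal sketch of an activity that logs its lifecycle transitions.
    public class LifecycleActivity extends Activity {
        private static final String TAG = "LifecycleActivity";

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Initialize the activity and its UI here.
            Log.d(TAG, "onCreate");
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Reload or reconfigure state; the user can now interact with the UI.
            Log.d(TAG, "onResume");
        }

        @Override
        protected void onPause() {
            // Save any important state: a paused activity may later be destroyed.
            Log.d(TAG, "onPause");
            super.onPause();
        }
    }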
4.2.1.2 Services
A Service is an application component that can perform long-running operations in the
background and does not provide a user interface. Another application component can
start a service and it will continue to run in the background, even if the user switches to
another application [15].
Services have a much simpler life cycle than activities (see Figure 8). You either start a
service or stop it. Also, the service life cycle is more or less controlled by the developer,
and not so much by the system [20].
Figure 8: Android Service lifecycle (Source: Android Developers)
A service can essentially have two states, started and bound. A service is "bound" when
an application component binds to it by calling bindService().
A service is "started" when an application component (such as an activity) starts it by
calling startService(). Once started, a service can run in the background indefinitely,
even if the component that started it is destroyed. Usually, a started service performs a
single operation and does not return a result to the caller. When the operation is done,
the service should stop itself [15].
To allow components to start the service, some callback methods have to be
implemented. onCreate() is called when the service is first created. onStartCommand()
is called every time the service is explicitly started through an intent; while the
service is running, this method can be called multiple times by the system to queue
more work for the service. onDestroy() is called when the service is no longer in use or
is being destroyed; the service should clean up any resources it holds, as this is the
last call the service receives.
The service start strategy can be adjusted through the return value of the
onStartCommand() method. In the SENSOROID framework this method returns the
START_NOT_STICKY constant, which tells the OS not to bother recreating the
service once it has been killed.
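A minimal sketch of such a started service, assuming only the behavior described above (the class name is hypothetical):

    import android.app.Service;
    import android.content.Intent;
    import android.os.IBinder;

    // Sketch of a started (non-bound) service that is not recreated if killed.
    public class SensingService extends Service {

        @Override
        public int onStartCommand(Intent intent, int flags, int startId) {
            // Begin the background work here; this method may be called
            // multiple times to queue more work for the service.
            return START_NOT_STICKY; // do not recreate the service if it is killed
        }

        @Override
        public void onDestroy() {
            // Last callback received: release any resources held by the service.
            super.onDestroy();
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null; // this service does not support binding
        }
    }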
4.2.2 Intents
Intents are messaging objects that are sent among the building blocks. They trigger an
activity to start up; tell a service to start, stop, or bind; or are simply broadcast.
Intents are asynchronous, meaning the code that sends them does not have to wait for
them to be completed. Applications use Intents for both inter-application communication
and intra-application communication. Additionally, the operating system sends Intents to
applications as event notifications. Some of these event notifications are system-wide
events that can only be sent by the operating system; we call these messages system
broadcast Intents [21].
An intent can be explicit or implicit. In an explicit intent, the sender clearly spells out
which specific component should be on the receiving end. In an implicit intent, the
sender declares a general action to perform, which allows a component from another
app to handle it.
When you create an implicit intent, the Android system finds the appropriate component
to start by comparing the contents of the intent to the intent filters declared in the
manifest file of other apps on the device. If the intent matches an intent filter, the system
starts that component and delivers it the Intent object. If multiple intent filters are
compatible, the system displays a dialog so the user can pick which app to use.
An intent filter is an expression in an app's manifest file that specifies the type of intents
that the component would like to receive. For instance, by declaring an intent filter for an
activity, you make it possible for other apps to directly start your activity with a certain
kind of intent. Likewise, if you do not declare any intent filters for an activity, then it can
be started only with an explicit intent [15].
Although intents facilitate communication between components in several ways, there
are three fundamental use-cases:
 startActivity()
Activities are started with Intents, by passing an Intent to startActivity().
 startService()
Intents are used to start and bind to Services, by passing an Intent to
startService(). The Intent describes the service to start and carries any
necessary data.
 sendBroadcast()
Intents passed to sendBroadcast() are delivered to multiple applications. Broadcast
Receivers are triggered by the receipt of an appropriate Intent and then run in the
background to handle the event.
Intents can be sent between three of the four components: Activities, Services, and
Broadcast Receivers. Intents can be used to start Activities; start, stop, and bind
Services; and broadcast information to Broadcast Receivers. For a Service or Activity to
receive Intents, it must be declared in the manifest.
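As an illustration, the fragment below (a sketch, assuming a Context reference and the hypothetical SensingService from the previous subsection) contrasts an explicit intent with an implicit one:

    import android.content.Context;
    import android.content.Intent;
    import android.net.Uri;

    public class IntentExamples {
        public static void send(Context context) {
            // Explicit intent: the target component is named directly.
            context.startService(new Intent(context, SensingService.class));

            // Implicit intent: only the action and data are declared; the system
            // matches them against intent filters and may show a chooser dialog.
            Intent view = new Intent(Intent.ACTION_VIEW, Uri.parse("http://example.com"));
            view.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK); // needed outside an Activity
            context.startActivity(view);
        }
    }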
4.2.3 Context
Context is an abstract class whose implementation is provided by the Android system;
it has access to global information about the application environment. The context helps
the current activity interact with local files, databases, class loaders associated with the
environment, services (including system-level services), and more.
The context can be accessed by invoking getApplicationContext(), getContext(),
getBaseContext(), or this (inside an activity class).
Typical uses of context:
 Creating new objects: views, adapters, listeners.
 Accessing standard common resources, such as system services (e.g.
LAYOUT_INFLATER_SERVICE) or shared preferences: context.getSystemService(),
getApplicationContext().getSharedPreferences().
 Accessing components implicitly, such as content providers, broadcasts, and
intents: getApplicationContext().getContentResolver().query(uri, ...).
Finally, both the Activity and Service classes are subclasses of the Context class, so
their instances can be used wherever a context is needed. In the case of our framework,
we needed to access the application context outside the classes mentioned above. A
new class called ApplicationContextProvider was created to support the generic
architecture of the framework and to provide the context anywhere outside an Activity or
Service. It also has to be declared in the application tag of AndroidManifest.xml; after
that, it can be accessed by simply calling ApplicationContextProvider.getContext().
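The thesis does not list the code of this class; one common way to implement it, sketched here under the assumption that it is an Application subclass (which is what a declaration in the application tag implies), is:

    import android.app.Application;
    import android.content.Context;

    // Sketch: exposes the application context statically; referenced from the
    // android:name attribute of the <application> tag in AndroidManifest.xml.
    public class ApplicationContextProvider extends Application {
        private static Context appContext;

        @Override
        public void onCreate() {
            super.onCreate();
            appContext = getApplicationContext();
        }

        public static Context getContext() {
            return appContext;
        }
    }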
4.2.4 Listeners
Listeners are objects that watch for state changes. The
SensorEventListener is an interface that provides the callbacks to alert an app to
sensor-related events. To be made aware of these events, an app registers a concrete
class that implements SensorEventListener with the SensorManager.
SensorManager is the Android system service that gives an app access to hardware
sensors. Like other system services, it allows apps to register and unregister for
sensor-related events. Once registered, an app will receive sensor data from the hardware [34].
An application, or in our case a sensing framework, must implement
SensorEventListener and contain implementations of both onSensorChanged() and
onAccuracyChanged() in order to receive sensor data, extract data from the SensorEvent
depending on the sensor type, and ensure that the application/framework unregisters the
listener at the right time.
It is important to remember to unregister sensor listeners whenever they are not in use.
Not doing so drains the battery and uses system resources, including the garbage
collector. Android does not take care of this by itself when another Activity comes to the
foreground or when the screen is turned off; it is in the hands of the app developer to
control listeners wisely. If Android kills the application, however, it also unregisters its
listeners [22].
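A minimal sketch of this registration discipline, tied to the activity lifecycle (the class name is illustrative):

    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    public class LightActivity extends Activity implements SensorEventListener {
        private SensorManager sensorManager;
        private Sensor lightSensor;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            lightSensor = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Register only while in the foreground.
            sensorManager.registerListener(this, lightSensor,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }

        @Override
        protected void onPause() {
            // Unregister to avoid draining the battery in the background.
            sensorManager.unregisterListener(this);
            super.onPause();
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            float lux = event.values[0]; // ambient light level in lux
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }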
4.2.5 Android Manifest
This file contains information such as the Android version and the required permissions;
the main activity is also declared in the Manifest so that it is shown first when the
application is launched.
The Manifest must contain all of the Android components (activities, services, content
providers and broadcast receivers) that the application uses. Each component is
related to the class that implements it and the conditions under which it is executed.
By default, Android applications are not able to access any part of the system outside
their own application components. To access any system resource or content, the
application must declare the required permissions, letting the user be aware of the
application's scope.
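For illustration, a manifest sketch (package and component names are hypothetical) declaring a permission and the components of a small sensing app:

    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.example.sensoroid">

        <!-- The user is made aware of this scope at install time. -->
        <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />

        <application android:label="SENSOROID">
            <!-- The main activity, shown first when the app is launched. -->
            <activity android:name=".MainActivity">
                <intent-filter>
                    <action android:name="android.intent.action.MAIN" />
                    <category android:name="android.intent.category.LAUNCHER" />
                </intent-filter>
            </activity>
            <!-- Every component the application uses must be declared. -->
            <service android:name=".LightSensorService" />
        </application>
    </manifest>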
4.2.6 Sensors and Location
A fact that contributes to making Android-powered devices really powerful is the several
built-in sensors they carry. Android devices are commonly manufactured with several
sensors that can be used in any application, and the Android operating system is
designed to manage all these built-in sensors through an easy-to-use framework for
Android application development.
As there are many possible sensors that can be implemented on an Android device,
they are divided into three main groups:
 Motion sensors: These sensors measure the forces applied to the device, that
is, acceleration and rotation forces. The best-known sensors are the
accelerometer and the gyroscope.
 Environmental sensors: This type of sensor measures environmental
parameters like temperature, light, humidity, etc. Typical environmental sensors
are barometers, thermometers and photometers.
 Position sensors: Position sensors measure the parameters that determine the
physical position of the device. The most common sensors are the
magnetometer and the proximity sensor.
Our framework provides access to 12 Android sensors (see Figure 9), whose values
can be retrieved easily by a developer.
Figure 9: SENSOROID implemented sensors
1. Sensor.TYPE_LIGHT
The light sensor is a hardware sensor visible on the front side of the device. It is a
simple photodiode, which generates a voltage when light is incident on it. The
light sensor measures the ambient light level (illuminance) in lux (lx) and has a
dynamic range between 1 and 30,000 lux. It is mostly used to adjust screen
brightness according to ambient light.
2. Sensor.TYPE_ACCELEROMETER
The accelerometer is a hardware sensor used to detect shakes, tilt motions, etc.
Android reports acceleration force in m/s². Retrieving values from the
accelerometer, you will receive X, Y and Z values (including the force of gravity)
that correspond to the axes shown in Figure 10.
Figure 10: Accelerometer axes (Source: Android Developers)
Accelerometers are typically used in one of three modes: as an inertial
measurement of velocity and position; as a sensor of inclination, tilt, or
orientation in two or three dimensions, referenced from the acceleration of
gravity; or as a vibration or impact (shock) sensor.
3. Sensor.TYPE_GRAVITY
The gravity sensor is a hardware or software sensor derived from the 3-axis
acceleration sensor. It measures the vector components of gravity when the
device is at rest or moving slowly; when a device is at rest, the gravity sensor
readings should match those of the accelerometer.
The gravity sensor outputs values along the three Cartesian axes. When a device
is accelerated in the ±X, ±Y, or ±Z direction, the corresponding output increases
(+) or decreases (-); acceleration changes generated by gravity are sensed in
the opposite direction.
4. Sensor.TYPE_GYROSCOPE
The gyroscope sensor measures the rate of rotation in rad/s, calculated from the
measurement data retrieved from a 3-axis gyroscope. You cannot directly
measure angles using a gyroscope; however, the gyroscope values are often
integrated over time to calculate an angle.
Rotation is positive in the counter-clockwise direction; that is, an observer looking
from some positive location on the x, y or z axis at a device positioned at the
origin would report positive rotation if the device appeared to be rotating
counter-clockwise. This is the standard mathematical definition of positive rotation
and is not the same as the definition of roll used by the orientation sensor.
Standard gyroscopes provide raw rotational data without any filtering or
correction for noise and drift (bias). In practice, gyroscope noise and drift
introduce errors that need to be compensated for. You usually determine the drift
(bias) and noise by monitoring other sensors, such as the gravity sensor or
accelerometer.
5. Sensor.TYPE_LINEAR_ACCELERATION
The linear acceleration sensor provides a three-dimensional vector
representing acceleration along each device axis, excluding the force of gravity.
This sensor is needed to obtain acceleration data without the influence of gravity.
The linear acceleration sensor always has an offset, which you need to remove.
The simplest way to do this is to build in a calibration step: during calibration you
ask the user to set the device on a table, and then read the offsets for all
three axes. You can then subtract that offset from the sensor's direct
readings to get the actual linear acceleration. The sensor coordinate system is
the same as the one used by the acceleration sensor, as are the units of
measure (m/s²).
6. Sensor.TYPE_MAGNETIC_FIELD
Magnetic field sensors report the magnetic field in x, y, and z (by having three
separate sensors, one aligned along each axis). Android reports magnetic fields
in microtesla (µT); a typical dynamic range is around 2000 µT.
Without a magnetic sensor you can see your position on the map, but not
your orientation (the map does not rotate when you do). The magnetic sensor
helps you take that first step in the correct direction with the help of the direction
pointer.
7. Sensor.TYPE_PRESSURE
This constant refers to a MEMS barometer, which measures air pressure. The
sensor is currently available only in a few devices; its primary use is
determining altitude in places where the device cannot get a GPS fix, such as
locations inside a building. The pressure measurement can be used to
forecast short-term changes in the weather and to estimate altitude. The
ambient air pressure is returned as a single value in hectopascals or
millibars (hPa or mbar).
8. Sensor.TYPE_PROXIMITY
The proximity sensor is a hardware sensor used to calculate how far away an
object is from the device; it is usually used to determine how far a person's head
is from the face of a handset (for example, when a user is making or
receiving a phone call).
Most proximity sensors return the absolute distance in cm, but some return only
binary near and far values.
9. Sensor.TYPE_ROTATION_VECTOR
The rotation vector represents the orientation of the device as a combination of
an angle and an axis: the device has rotated through an angle θ around
an axis (x, y, or z), and the sensor returns 4 values. The three elements of the
rotation vector are <x·sin(θ/2), y·sin(θ/2), z·sin(θ/2)>, such that the magnitude of
the rotation vector equals sin(θ/2) and its direction equals the direction of the
axis of rotation. The fourth value is the scalar component cos(θ/2), and it is
optional.
The rotation vector sensor is particularly versatile and can be used for a wide
range of motion-related tasks, such as detecting gestures, monitoring angular
change, and monitoring relative orientation changes. For example, it is ideal for
developing a game, an augmented reality application, a 2-dimensional or
3-dimensional compass, or a camera stabilization app [29].
10. Sensor.TYPE_AMBIENT_TEMPERATURE
The ambient (room) temperature sensor measures the ambient air temperature,
returning one value in degrees Celsius (°C). Not many devices have this sensor
available. The raw data acquired from the ambient temperature sensor usually
requires no calibration, filtering, or modification.
This sensor is meant to replace Sensor.TYPE_TEMPERATURE, which has been
deprecated.
11. Orientation Sensor
The orientation sensor derives its data by processing the raw sensor data from
the accelerometer and the magnetic field sensor. Because of the heavy
processing involved, the accuracy and precision of the orientation sensor are
diminished (specifically, this sensor is only reliable when the roll component is
0). Instead of using raw data from the orientation sensor, the getRotationMatrix()
method is used in conjunction with the getOrientation() method to compute
orientation values [29].
getOrientation() returns 3 values: the azimuth (rotation about the z-axis),
pitch (rotation about the x-axis), and roll (rotation about the y-axis), in radians.
More specifically, the accelerometer and magnetometer measurements are
passed into getRotationMatrix(), which populates the rotation matrix; the
generated rotation matrix is then passed into getOrientation() to get yaw, pitch,
and roll, as sketched below.
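A sketch of this computation (here, gravity and geomagnetic are assumed to hold the latest accelerometer and magnetometer readings captured in onSensorChanged()):

    float[] rotationMatrix = new float[9];
    float[] orientationAngles = new float[3];
    // gravity: latest TYPE_ACCELEROMETER values; geomagnetic: latest
    // TYPE_MAGNETIC_FIELD values, both captured in onSensorChanged().
    if (SensorManager.getRotationMatrix(rotationMatrix, null, gravity, geomagnetic)) {
        SensorManager.getOrientation(rotationMatrix, orientationAngles);
        float azimuth = orientationAngles[0]; // rotation about the z-axis, in radians
        float pitch   = orientationAngles[1]; // rotation about the x-axis, in radians
        float roll    = orientationAngles[2]; // rotation about the y-axis, in radians
    }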
12. Location
Android-powered devices have two ways of requesting location updates. The
best-known and most accurate one is GPS (the Global Positioning System);
however, the GPS locating mechanism can only be used outdoors and it
consumes a lot of energy.
The other available location strategy works via cellular networks: Android's
Network Location Provider determines user location using cell tower and Wi-Fi
signals, providing location information in a way that works indoors and outdoors,
responds faster, and uses less battery power.
In order to receive location updates from NETWORK_PROVIDER or
GPS_PROVIDER, user permission must be requested by declaring the
ACCESS_COARSE_LOCATION or ACCESS_FINE_LOCATION permission,
respectively, in the Android manifest file. Without these permissions, the
application will fail at runtime when requesting location updates [29].
A location can consist of latitude, longitude, a timestamp, and other information
such as bearing, altitude and velocity [29]. All locations are generated by the
LocationManager, which provides access to the system location services and
allows an application to tell Android when it is interested in receiving location
updates and when it no longer wants them. The LocationManager can also
provide the cached location of the device, the current state of the location system
(such as the available location providers), and GPS status information (see
figure 11).
Figure 11: Android location components (Source: [12])
All LocationProviders generate location data differently, but they communicate
with an application in the same way and provide similar data in the same manner.
The LocationListener interface contains a group of callback methods that are
called in reaction to changes in the device's current location or in the state of the
location service. Objects that implement LocationListener are notified of location
updates by a call to their onLocationChanged() method. The specific
LocationListener instances to be notified about a location update are
registered with the LocationManager; when the LocationManager has a new
location to offer, it calls onLocationChanged() on each listener [22].
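A sketch of this registration flow (the update interval and distance are arbitrary, and the helper class is ours):

    import android.content.Context;
    import android.location.Location;
    import android.location.LocationListener;
    import android.location.LocationManager;
    import android.os.Bundle;

    public class LocationSketch {
        // Requests network-based updates at most every 10 s or 50 m; returns the
        // listener so the caller can later invoke lm.removeUpdates(listener).
        public static LocationListener start(Context context) {
            final LocationManager lm =
                    (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
            LocationListener listener = new LocationListener() {
                @Override public void onLocationChanged(Location location) {
                    double latitude  = location.getLatitude();
                    double longitude = location.getLongitude();
                }
                @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
                @Override public void onProviderEnabled(String provider) { }
                @Override public void onProviderDisabled(String provider) { }
            };
            lm.requestLocationUpdates(LocationManager.NETWORK_PROVIDER, 10000, 50, listener);
            return listener;
        }
    }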
The Android operating system can also provide parameters from sensors that are called
software-based or virtual sensors. This means that the data acquired from these
sensors is not measured directly by a hardware sensor; instead, the values are
computed from one or more hardware-based sensors. An example is the orientation
sensor described above, which combines the data from the magnetic field sensor and
the accelerometer. Other virtual sensors are the linear acceleration sensor and the
gravity sensor, which both compute their values from the hardware accelerometer.
A fact that makes it easier to develop sensor-based Android applications is that the
Android sensor framework provides very accurate sensor values, well formatted in
units of the SI (International System of Units).
Android does not specify a standard sensor configuration for devices, which means that
device manufacturers can incorporate any sensor configuration they want into their
Android-powered devices. As a result, devices can include a variety of sensors in a
wide range of configurations [15].
4.3 Development Requirements and Constraints
When it comes to designing a framework, there are quite a few documented
methodologies. Most of them, though, agree that the objective is to identify abstractions
with a bottom-up approach: start by examining existing solutions and generalize from
them. When adopting a framework development process, it is important to take into
account that a framework will usually start as a white-box framework and ideally evolve
into a black-box one. It is also important to understand that developing a framework is
more difficult than developing an application, because a framework is a piece of
software you need in order to create your application: it can be a library, a collection of
libraries, a collection of scripts, etc.
There are two different activities in framework development: core framework design and
framework internal increments. The core framework design comprises both abstract and
concrete classes in the domain. The concrete classes are intended to be invisible and
transparent to the end user (e.g. a basic storage utility), while the abstract classes are
either intended to be invisible or to be used through subclassing. Framework internal
increments, on the other hand, build additional classes that form a number of class
libraries. A number of activities can be identified in the first development phase, namely
domain analysis, architectural design, framework design, framework implementation,
framework testing and application testing.
With frameworks, developers do not have to start from scratch every time they write a
particular application. Frameworks help developers devote their time to meeting
software requirements rather than dealing with the more standard low-level details,
reducing in this way the overall development time. They also provide a well-designed
and thought-out infrastructure, so that when new pieces are created and added, they
can be integrated with minimal impact on the other pieces of the framework.
A few design principles we followed while designing our framework were:
1. Implement the high-level requirements through abstract classes and interfaces.
2. Provide utility classes that might be useful for framework users.
3. Consider what should stay internal (a kind of metadata) and should not be exposed
to framework users (application developers).
4. Name things clearly, but concisely. This is probably one of the hardest tasks for
framework developers, especially when implementing abstract or generalized
concepts; some names have multiple meanings and may be interpreted differently
than you might expect.
Speaking more precisely about SENSOROID as a sensing framework that is designed to
be useful not just within Android code, an important constraint was to hide
Android-specific sensor code and context management. We made the following
decisions while implementing the framework:
 We chose interfaces rather than abstract classes because interfaces promote
composition over inheritance. Inheritance, from our point of view, introduces a whole
extra layer of complexity, as you need to work out the relationships between
superclasses and subclasses, while interfaces are usually cleaner for the reason
mentioned above.
 Our main objective was to think about what the developer should expect. Basically,
this relates to data access from sensors. We tried to make it flexible and useful for
framework users to get sensor data without requiring Android development
experience.
 Through all the development steps we designed for testability. Below we explain a
sample application that others can use to understand the framework functionality.
 Finally, we think that SENSOROID is easy to extend with new features and can
grow while maintaining this design structure.
4.4 Framework Architectural Design
The list of Android structures used in the sensing framework implementation was
presented earlier in the text. These features are used for packaging, storing and
sending sensor data, creating the basis of a common interface: a communication bridge
between framework users and sensor drivers.
Figure 12 shows the architecture of the SENSOROID framework at every
level, via the Sensor Interface and the Sensor Data Listener. Framework users
(developers) only need to implement the specific methods that handle sensor data
received from the bottom level. The framework supports multiple calls by providing
abstractions that hide sensor-specific Android properties. SENSOROID exposes all 11
built-in Android sensors (for Android 4.4.2 and greater), creating a single communication
channel to manage them.
Figure 12: SENSOROID architecture of all levels
4.5 Framework Development Analysis
It is important in this part to explain how the SENSOROID framework was developed,
by analyzing step by step the programming approaches we took to achieve the desired
result. There are three main points to emphasize when explaining the framework
program structure: the sensor interface, the sensor manager via services, and sensor
data acquisition. Our framework is composed of Java files (Android 4.4.2 or greater),
and for every package we explain its functionality and how its components cooperate.
4.5.1 Framework Interface
The sensor interface's role is to communicate with low-level tasks, encapsulating the
complexities linked to each sensor. It enables developers to use all the built-in
Android sensors through two methods:
 startSensor(SensorType);
 stopSensor(SensorType);
These two methods are defined in the file Sensor.java. As can be seen, the sensor
interface requires the sensor type (e.g. Light, Accelerometer, Pressure, etc.) in order to
control a specific sensor in the framework. Once startSensor(SensorType) is called, a
service starts working in the background for the requested sensor type. The service
keeps running until the stopSensor(SensorType) method is called.
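A sketch of this interface (the exact member list of the SensorType enumeration is our assumption, based on the sensors of Figure 9):

    // Sensor.java (sketch): the two methods exposed to framework users.
    public interface Sensor {
        void startSensor(SensorType type); // starts the background service for this sensor
        void stopSensor(SensorType type);  // stops the corresponding service
    }

    // The sensor types handled by the framework (illustrative).
    public enum SensorType {
        Light, Accelerometer, Gravity, Gyroscope, LinearAcceleration,
        MagneticField, Pressure, Proximity, RotationVector,
        AmbientTemperature, Orientation, Location
    }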
4.5.2 Sensor Interface Implementation
In the file SensorImpl.java we developed the implementation of the sensor interface.
For each sensor type, the start and the stop of the responsible service is implemented.
These operations require calling startService() and stopService(). As services
are an Android application component and a subclass of Context, we had to retrieve
the context inside the framework to keep it generic and easy to use for
developers, as mentioned before. To realize this, we used the
ApplicationContextProvider.java utility class, which provides the Context to every Java
class in a static way. In general this could lead to problems, but in this particular case
the approach is good practice, because we have to pass a context beyond the
scope of an Activity, and it also helps to avoid memory leaks. To start the sensor
services from an activity we use an explicit intent with startService(). The method returns
immediately, and the Android system calls the service's responsible method to begin
running. The delivered intent is the only communication node between the application
component and the service.
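A sketch of this implementation (the per-type service lookup is our reconstruction, and LightSensorService is a hypothetical name):

    import android.content.Context;
    import android.content.Intent;

    // SensorImpl.java (sketch): each sensor type maps to its own service,
    // started and stopped with explicit intents.
    public class SensorImpl implements Sensor {

        @Override
        public void startSensor(SensorType type) {
            Context ctx = ApplicationContextProvider.getContext();
            ctx.startService(new Intent(ctx, serviceFor(type)));
        }

        @Override
        public void stopSensor(SensorType type) {
            Context ctx = ApplicationContextProvider.getContext();
            ctx.stopService(new Intent(ctx, serviceFor(type)));
        }

        private Class<?> serviceFor(SensorType type) {
            switch (type) {
                case Light: return LightSensorService.class;
                // ... one service class per supported sensor type
                default: throw new IllegalArgumentException("Unsupported type: " + type);
            }
        }
    }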
4.5.3 Sensor Manager via Services
The entry point to the Android Sensor API is the SensorManager class, which allows an
application to request sensor information and register to receive sensor data. For each
call to the service, the Sensor Manager dispatches the commands to the appropriate
sensor object which, in turn, uses a sensor driver to perform specific low-level tasks.
4.5.3.1 Sensor Services
Once the startService(...) method is called, the onStartCommand(Intent intent, int flags,
int startId) method creates the SensorManager, and the service keeps running in the
background. Just one sensor can be registered per service, so we have a different
service file for each sensor type. The service will then continue running until
stopService() or stopSelf() is called, no matter how many times it has been started.
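A sketch of one such per-sensor service, here for the light sensor (the listener class stands for the framework's SensorEventListener implementation; its name is our illustration):

    import android.app.Service;
    import android.content.Intent;
    import android.hardware.Sensor;
    import android.hardware.SensorManager;
    import android.os.IBinder;

    public class LightSensorService extends Service {
        private SensorManager sensorManager;
        private LightEventListener listener; // SensorEventListener implementation (hypothetical name)

        @Override
        public int onStartCommand(Intent intent, int flags, int startId) {
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            listener = new LightEventListener();
            // Register exactly one sensor for this service.
            sensorManager.registerListener(listener,
                    sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT),
                    SensorManager.SENSOR_DELAY_NORMAL);
            return START_NOT_STICKY;
        }

        @Override
        public void onDestroy() {
            // Triggered by stopService()/stopSelf(): release the sensor.
            sensorManager.unregisterListener(listener);
            super.onDestroy();
        }

        @Override
        public IBinder onBind(Intent intent) { return null; }
    }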
4.5.3.2 Sensor Manager
SensorManager is the Android system service that gives an app access to hardware
sensors. Like other system services, it allows apps to register and unregister for sensor
events. Once registered, an app will receive sensor data from the hardware.
You can access the SensorManager via getSystemService(SENSOR_SERVICE).
The Sensor class defines several constants for accessing the different sensors (e.g.
Sensor.TYPE_ACCELEROMETER).
You can obtain a sensor via the sensorManager.getDefaultSensor() method, which
takes the sensor type as a parameter; the sampling delay, defined as a constant on
SensorManager, is chosen when registering a listener for the sensor. The sensor
returned by getDefaultSensor() may be either a raw sensor or a synthetic sensor that
manipulates raw sensor data [22].
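For example (a fragment, assuming a context and an already-created listener):

    SensorManager sensorManager =
            (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    // The sampling delay is chosen here, at registration time.
    sensorManager.registerListener(listener, accelerometer,
            SensorManager.SENSOR_DELAY_NORMAL);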
4.5.4 Gathering Sensor Data
Our framework provides the possibility of acquiring data from the sensors, and in this
part we explain the classes and components involved in this process and their
relationships.
Figure 13: Sensor Manager interaction with sensor hardware (Source: Tizen Project)
4.5.4.1 Sensor Event Listener Interface
The SensorEventListener is an interface that provides the callbacks to alert an app to
sensor-related events. To be made aware of these events, an app registers a concrete
class that implements SensorEventListener with the SensorManager.
The SensorEvent is the data structure that contains the information that is passed to an
app when a hardware sensor has information to report. A SensorEvent object is passed
from the sensor system service to callback methods on SensorEventListener. The
listener then processes the data in a SensorEvent object in an application-specific
manner. The onSensorChanged(SensorEvent event) method is called every time a
sensor value changes. Like any listener, the SensorEventListener should be
registered when the sensor service starts and unregistered when it stops.
4.5.4.2 Custom Data Listener Interface
The Custom Data Listener interface, like the Sensor interface, provides the access point
for application developers to interact with the SENSOROID framework. We said before
that we can provide access to sensor data through SensorEventListener, but this is an
Android interface, and we need our framework to be usable not just from Activities or
Fragments. To achieve this, we developed our own listener interface, which is likewise
called every time the sensor values change. The method developers implement in order
to receive sensor data is receiveSensorData(), which maps the data for each sensor
type in a uniform way. Diagram 1 below shows the sequence of the listening process
through the Custom Listener.
Diagram 1: Custom Data Listener Interface activity sequence diagram
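A sketch of this interface (the parameter shape, a sensor type plus the raw values, is our assumption):

    // SensorDataListener (sketch): implemented by framework users to receive
    // sensor values in a uniform, Android-independent way.
    public interface SensorDataListener {
        void receiveSensorData(SensorType type, float[] values);
    }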
4.5.4.3 Broadcast Receivers
Besides the Custom Data Listener interface, sensor data can also be gathered through
the Broadcast Receiver class and its abstract method onReceive(Context, Intent). In the
beginning we used receivers to get the data values, but it was difficult to keep a high
level of abstraction because of the Android components, such as Context and Intents,
that receivers need in order to be triggered.
4.6 Data Flow Modulation
 Framework Class Diagram
In the previous sections of this thesis, we described the SENSOROID framework
architecture and implementation in detail, explaining every system unit and their
communication channels. To make the interactions between the classes clear, the class
diagram is presented below. The independence between the classes is clearly visible,
achieving the abstraction we need for our framework.
Diagram 2: SENSOROID class diagram
 Start and Stop Sensor Sequence Diagram
In this sequence diagram we show how we dealt with context awareness in the sensor
interface and our solution for hiding Android-specific sensor code.
Diagram 3: Start Sensor method activity sequence diagram
 Stop Sensor Sequence Diagram
The same pattern was implemented to call the stop sensor method in the framework
interface. As the diagram shows, every sensor follows a common way of interacting
with the services, resulting in an easy structure to manage.
Diagram 4: Stop Sensor method activity sequence diagram
4.7 Test and Extensibility
4.7.1 Testing on the real device versus the simulator
Before testing our framework on an actual device, we tried to simulate motion
and retrieve sensor data using the OpenIntents SensorSimulator (see Figure 14).
This simulator also lets you simulate the battery level and the GPS position using a
Telnet connection. It currently supports the accelerometer, compass, orientation,
temperature, light, proximity, pressure, gravity, linear acceleration, rotation vector and
gyroscope sensors, and their behavior can be customized through various settings.
Figure 14: SensorSimulator overview (Source: OpenIntents)
However, the emulator does not represent the specific implementation of the Android
platform that is unique to a given device; it does not use the same hardware to
determine signal, networking, or location information [23].
Even though this method helps to involve sensors not embedded in most common
devices, we chose to focus on functional testing on an actual device.
The MainActivity class is the key execution component of the SENSOROID test
application; it represents a single screen with a simple user interface. This class helps
to verify the detailed design of the framework on a real Android mobile device.
When the user selects the SENSOROID app icon on the Home screen (see figure
15), the system calls the onCreate() method of the Activity that is declared as
the "launcher" (or "main") activity. This is the activity that serves as the main entry
point to the app's user interface.
Figure 15: Real device testing UI
The buttons Start Light Sensor and Stop Light Sensor call the functions
startSensor(SensorType.Light) and stopSensor(SensorType.Light), respectively. After
that, the screen toasts the event values retrieved from the device's sensor. To show
how an Activity can cooperate with the framework, another class diagram is given
below, this time including the MainActivity class. As explained before, in the Activity
class the developer can implement the Sensor Interface and the Custom Data Listener
and build further extensions that work with the Android sensors, as in the sketch below.
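A minimal sketch of such an activity (UI wiring omitted; the button-handler method names are illustrative):

    import android.app.Activity;
    import android.view.View;
    import android.widget.Toast;

    public class MainActivity extends Activity implements SensorDataListener {
        private final Sensor sensorInterface = new SensorImpl(); // the framework's Sensor interface

        public void onStartLightClicked(View v) {
            sensorInterface.startSensor(SensorType.Light);
        }

        public void onStopLightClicked(View v) {
            sensorInterface.stopSensor(SensorType.Light);
        }

        @Override
        public void receiveSensorData(SensorType type, float[] values) {
            // Toast each value received from the framework.
            Toast.makeText(this, type + ": " + values[0], Toast.LENGTH_SHORT).show();
        }
    }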
Figure 16: Main Activity Class Diagram
To check how many sensors a specific test device has, the application named
PhoneTester comes in very helpful (see Figure 17).
Figure 17: Phone Tester Application UI
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices
Developing a sensing framework for mobile devices

More Related Content

Similar to Developing a sensing framework for mobile devices

Iot attendance system using fingerprint module
Iot attendance system using fingerprint module Iot attendance system using fingerprint module
Iot attendance system using fingerprint module AjinkyaMore29
 
Mphasis ppt on internet of things for internship
Mphasis ppt on internet of things for internshipMphasis ppt on internet of things for internship
Mphasis ppt on internet of things for internshipNeha Yadav
 
Mphasis ppt on internet of things for internship
Mphasis ppt on internet of things for internshipMphasis ppt on internet of things for internship
Mphasis ppt on internet of things for internshipNeha Yadav
 
Iris based Human Identification
Iris based Human IdentificationIris based Human Identification
Iris based Human Identificationdswazalwar
 
IRJET - For(E)Sight :A Perceptive Device to Assist Blind People
IRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind PeopleIRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind People
IRJET - For(E)Sight :A Perceptive Device to Assist Blind PeopleIRJET Journal
 
PSFK Future Of Wearable Tech Report
PSFK Future Of Wearable Tech ReportPSFK Future Of Wearable Tech Report
PSFK Future Of Wearable Tech ReportPSFK
 
Future of Wearable Tech PSK
Future of Wearable Tech PSKFuture of Wearable Tech PSK
Future of Wearable Tech PSKJosh Trent
 
Future of Wearable Tech Report
Future of Wearable Tech ReportFuture of Wearable Tech Report
Future of Wearable Tech ReportIntel iQ
 
Gridforum David De Roure Newe Science 20080402
Gridforum David De Roure Newe Science 20080402Gridforum David De Roure Newe Science 20080402
Gridforum David De Roure Newe Science 20080402vrij
 
Enisa report guidelines for securing the internet of things
Enisa report   guidelines for securing the internet of thingsEnisa report   guidelines for securing the internet of things
Enisa report guidelines for securing the internet of thingsnajascj
 
Ambient intelligence - an overview
Ambient intelligence - an overviewAmbient intelligence - an overview
Ambient intelligence - an overviewFulvio Corno
 
Obstacle Detection and Navigation system for Visually Impaired using Smart Shoes
Obstacle Detection and Navigation system for Visually Impaired using Smart ShoesObstacle Detection and Navigation system for Visually Impaired using Smart Shoes
Obstacle Detection and Navigation system for Visually Impaired using Smart ShoesIRJET Journal
 
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania SusheelaContents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania SusheelaGovt. P.G. College Dharamshala
 
ACIS Annual Report 2014
ACIS Annual Report 2014ACIS Annual Report 2014
ACIS Annual Report 2014Ralf Klamma
 
PPT ON INTERNET OF THINGS.pptx
PPT ON INTERNET OF THINGS.pptxPPT ON INTERNET OF THINGS.pptx
PPT ON INTERNET OF THINGS.pptxSwagatoBiswas
 
An HCI Principles based Framework to Support Deaf Community
An HCI Principles based Framework to Support Deaf CommunityAn HCI Principles based Framework to Support Deaf Community
An HCI Principles based Framework to Support Deaf CommunityIJEACS
 
Seminar report on blue eyes
Seminar report on blue eyesSeminar report on blue eyes
Seminar report on blue eyesRoshmi Sarmah
 
Seminar report on blue eyes
Seminar report on blue eyesSeminar report on blue eyes
Seminar report on blue eyesRoshmi Sarmah
 
Future of Wearable Tech 2014 (PSFK, IQ Intel)
Future of Wearable Tech 2014 (PSFK, IQ Intel)Future of Wearable Tech 2014 (PSFK, IQ Intel)
Future of Wearable Tech 2014 (PSFK, IQ Intel)Vasily Ryzhonkov
 

Similar to Developing a sensing framework for mobile devices (20)

Iot attendance system using fingerprint module
Iot attendance system using fingerprint module Iot attendance system using fingerprint module
Iot attendance system using fingerprint module
 
Mphasis ppt on internet of things for internship
Mphasis ppt on internet of things for internshipMphasis ppt on internet of things for internship
Mphasis ppt on internet of things for internship
 
Mphasis ppt on internet of things for internship
Mphasis ppt on internet of things for internshipMphasis ppt on internet of things for internship
Mphasis ppt on internet of things for internship
 
Sanni Siltanen: Developing augmented reality solutions through user involveme...
Sanni Siltanen: Developing augmented reality solutions through user involveme...Sanni Siltanen: Developing augmented reality solutions through user involveme...
Sanni Siltanen: Developing augmented reality solutions through user involveme...
 
Iris based Human Identification
Iris based Human IdentificationIris based Human Identification
Iris based Human Identification
 
IRJET - For(E)Sight :A Perceptive Device to Assist Blind People
IRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind PeopleIRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind People
IRJET - For(E)Sight :A Perceptive Device to Assist Blind People
 
PSFK Future Of Wearable Tech Report
PSFK Future Of Wearable Tech ReportPSFK Future Of Wearable Tech Report
PSFK Future Of Wearable Tech Report
 
Future of Wearable Tech PSK
Future of Wearable Tech PSKFuture of Wearable Tech PSK
Future of Wearable Tech PSK
 
Future of Wearable Tech Report
Future of Wearable Tech ReportFuture of Wearable Tech Report
Future of Wearable Tech Report
 
Gridforum David De Roure Newe Science 20080402
Gridforum David De Roure Newe Science 20080402Gridforum David De Roure Newe Science 20080402
Gridforum David De Roure Newe Science 20080402
 
Enisa report guidelines for securing the internet of things
Enisa report   guidelines for securing the internet of thingsEnisa report   guidelines for securing the internet of things
Enisa report guidelines for securing the internet of things
 
Ambient intelligence - an overview
Ambient intelligence - an overviewAmbient intelligence - an overview
Ambient intelligence - an overview
 
Obstacle Detection and Navigation system for Visually Impaired using Smart Shoes
Obstacle Detection and Navigation system for Visually Impaired using Smart ShoesObstacle Detection and Navigation system for Visually Impaired using Smart Shoes
Obstacle Detection and Navigation system for Visually Impaired using Smart Shoes
 
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania SusheelaContents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
Contents of Internet of Things(IoT) By Thakur Pawan & Pathania Susheela
 
ACIS Annual Report 2014
ACIS Annual Report 2014ACIS Annual Report 2014
ACIS Annual Report 2014
 
PPT ON INTERNET OF THINGS.pptx
PPT ON INTERNET OF THINGS.pptxPPT ON INTERNET OF THINGS.pptx
PPT ON INTERNET OF THINGS.pptx
 
An HCI Principles based Framework to Support Deaf Community
An HCI Principles based Framework to Support Deaf CommunityAn HCI Principles based Framework to Support Deaf Community
An HCI Principles based Framework to Support Deaf Community
 
Seminar report on blue eyes
Seminar report on blue eyesSeminar report on blue eyes
Seminar report on blue eyes
 
Seminar report on blue eyes
Seminar report on blue eyesSeminar report on blue eyes
Seminar report on blue eyes
 
Future of Wearable Tech 2014 (PSFK, IQ Intel)
Future of Wearable Tech 2014 (PSFK, IQ Intel)Future of Wearable Tech 2014 (PSFK, IQ Intel)
Future of Wearable Tech 2014 (PSFK, IQ Intel)
 

Recently uploaded

Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfsudhanshuwaghmare1
 
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProductAnonymous-April2024-WinProductDiscovery-MelissaKlemke
ProductAnonymous-April2024-WinProductDiscovery-MelissaKlemkeProduct Anonymous
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsJoaquim Jorge
 
Artificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyArtificial Intelligence Chap.5 : Uncertainty
Artificial Intelligence Chap.5 : UncertaintyKhushali Kathiriya
 
Scaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationScaling API-first – The story of a global engineering organization
Scaling API-first – The story of a global engineering organizationRadu Cotescu
 
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...
Mastering MySQL Database Architecture: Deep Dive into MySQL Shell and MySQL R...Miguel Araújo
 
2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...Martijn de Jong
 
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingRepurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost Saving
Repurposing LNG terminals for Hydrogen Ammonia: Feasibility and Cost SavingEdi Saputra
 
Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024Axa Assurance Maroc - Insurer Innovation Award 2024
Axa Assurance Maroc - Insurer Innovation Award 2024The Digital Insurer
 
From Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationFrom Event to Action: Accelerate Your Decision Making with Real-Time Automation
From Event to Action: Accelerate Your Decision Making with Real-Time AutomationSafe Software
 
🐬 The future of MySQL is Postgres 🐘
🐬  The future of MySQL is Postgres   🐘🐬  The future of MySQL is Postgres   🐘
🐬 The future of MySQL is Postgres 🐘RTylerCroy
 
presentation ICT roal in 21st century education
presentation ICT roal in 21st century educationpresentation ICT roal in 21st century education
presentation ICT roal in 21st century educationjfdjdjcjdnsjd
 
Boost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityBoost PC performance: How more available memory can improve productivity
Boost PC performance: How more available memory can improve productivityPrincipled Technologies
 
Top 5 Benefits OF Using Muvi Live Paywall For Live Streams
Top 5 Benefits OF Using Muvi Live Paywall For Live StreamsTop 5 Benefits OF Using Muvi Live Paywall For Live Streams
Top 5 Benefits OF Using Muvi Live Paywall For Live StreamsRoshan Dwivedi
 
A Domino Admins Adventures (Engage 2024)
Developing a sensing framework for mobile devices

To my loved ones.
-Alina

To my family that supported me in this journey.
-Iliona
ACKNOWLEDGMENTS

For the fulfillment of this thesis, we would like to thank our supervisors, Ms. Nancy Alonistioti and Mr. Dimitrios Soukaras, for their precious help and guidance. Their constructive comments and their assistance in the design of the framework were very valuable for our thesis.
TABLE OF CONTENTS

PREFACE
1. INTRODUCTION
   1.1 Motivation
   1.2 A brief summary of the subject
   1.3 Paper Roadmap
2. BACKGROUND
   2.1 Overview of Internet of Things
       2.1.1 IoT definition
       2.1.2 Evolution of the Internet of Things
       2.1.3 Use case scenarios and the future
   2.2 IoT as an internet of sensors
   2.3 Sensing with mobile phones
       2.3.1 Smartphones as a sensing platform
       2.3.2 What is a sensing framework?
       2.3.3 IoT applications using mobile phone based sensing
3. ANDROID AS A MOBILE OPERATING SYSTEM
   3.1 Android
       3.1.1 Background and history
       3.1.2 Architecture
       3.1.3 Android SDK
       3.1.4 Application features
   3.2 Android and the Internet of Things
   3.3 Choosing Android as the Implementation Platform
4. ANDROID SENSING FRAMEWORK
   4.1 Framework Description
       4.1.1 Core functionality - Overview
       4.1.2 Implementation environment
   4.2 SENSOROID Framework Fundamentals
       4.2.1 Framework components
       4.2.2 Intents
       4.2.3 Context
       4.2.4 Listeners
       4.2.5 Android Manifest
       4.2.6 Sensors and Location
   4.3 Development Requirements and Constraints
   4.4 Framework Architectural Design
   4.5 Framework Development Analysis
       4.5.1 Framework Interface
       4.5.2 Sensor Interface Implementation
       4.5.3 Sensor Manager via Services
       4.5.4 Gathering Sensor Data
   4.6 Data Flow Modulation
   4.7 Test and Extensibility
       4.7.1 Testing on the real device versus the simulator
       4.7.2 SENSOROID extensibility
5. CONCLUSIONS
ABBREVIATIONS AND ACRONYMS
APPENDIX A
REFERENCES
TABLE OF DIAGRAMS

Diagram 1: Custom Data Listener Interface activity sequence diagram
Diagram 2: SENSOROID class diagram
Diagram 3: Start Sensor method activity sequence diagram
Diagram 4: Stop Sensor method activity sequence diagram
TABLE OF FIGURES

Figure 1: Internet connected-devices and the future evolution (Source: Cisco 2011)
Figure 2: How a sensor works (from left to right) (Source: Carré & Strauss)
Figure 3: Range of Sensors in IoT (Source: Reproduced from "Practical Electronics for Inventors", Paul Scherz and Simon Monk)
Figure 4: Android Architecture (Source: blog.ameykelkar.com)
Figure 5: Global smartphone operating system market
Figure 6: SENSOROID framework structure
Figure 7: Android Activity Lifecycle (Source: Android Developers)
Figure 8: Android Service lifecycle (Source: Android Developers)
Figure 9: SENSOROID implemented sensors
Figure 10: Accelerometer axes (Source: Android Developers)
Figure 11: Android location components (Source: [12])
Figure 12: SENSOROID architecture of all levels
Figure 13: Sensor Manager interaction with sensor hardware
Figure 14: SensorSimulator overview (Source: OpenIntents)
Figure 15: Real device testing UI
Figure 16: Main Activity Class Diagram
Figure 17: Phone Tester Application UI
PREFACE

This text was written in Athens in 2015 to support our graduation thesis. The subject discussed in this paper arose from the need to implement a mobile sensing framework for use within the context of the Internet of Things. It was therefore a challenge for us to research this innovative technological concept, especially given the scarcity of research resources and supporting materials about it.
1. INTRODUCTION

1.1 Motivation

Mobile devices can create, share, and sync everything we want, regardless of distance. Smartphones are fast becoming a ubiquitous computing platform. Statistics show that the total number of mobile phones shipped worldwide in the first quarter of 2014 exceeded 448.6 million devices [IDE]. The worldwide smartphone market grew 27.2% year over year in the second quarter of 2014. By 2017, 87% of the worldwide smart, connected device market will be tablets and smartphones, with PCs (both desktop and laptop) making up the remaining 13% [IDE].

These new mobile devices are programmable and come with a growing set of powerful pre-installed embedded sensors able to detect GPS position, directional acceleration, rotational vectors, device proximity, temperature, ambient light conditions, etc. These sensors enable context-aware solutions and help create a new level of sensor-based applications in health, entertainment, access control, security, energy efficiency, home monitoring and home care.

Smartphones, like other modern mobile devices, run on various mobile operating systems, some of them even on two. The most common are Android, iOS for Apple smartphones and tablets, Windows Phone, etc. With such a variety of operating systems comes a great need for a generic framework that can retrieve the values of various sensor types regardless of the OS the smartphone is running.

In this paper, we present SENSOROID, a new framework that provides context-rich data streams collected from Android smartphones. Our architecture supports a service model; built on the Android platform, it can be used by Java developers for integrating contextual data.

1.2 A brief summary of the subject

We live in a connected world. Nearly two billion people connect to the Internet, share information and communicate over blogs, wikis, social networks and a host of other media. By 2020 there will be fifty billion connected units, a vision known globally as the "Internet of Things". It involves people and objects being able to connect to the Internet anytime and anywhere and share information about their behavior. The benefit of this grand vision, in summary, is to turn them all into a platform where data can be communicated, collated, analyzed and converted into useful information, in a secure way, with the aim of making our lives better. It is already here: miniaturization and other technological advances already make it possible to instrument and connect virtually any object. With the IoT, small sensors are being integrated into real-world objects, acting as instruments that offer information about almost everything that can be measured. So the main factor that makes machine-to-machine (M2M) communication possible is sensing. [1]
In this paper, we discuss a promising new research area called mobile sensing. It promotes completely decentralized sensing based on smartphone capabilities alone. Recent evolutions in smartphone platforms, such as Android and iOS, are broadening the traditional concept of the mobile device to provide not only computing resources but also sensing capabilities through built-in sensors. These new features make mobile devices powerful and complete sensing platforms that can continuously watch and monitor the behavior of users who move and act in the physical world carrying their mobile devices with them.

On the other hand, developing mobile sensing applications is not yet widespread, mainly because several technical issues remain open. Different devices and platforms such as Android and iOS expose very different interfaces to their sensors; privacy is another issue because of the amount and sensitivity of the sensed data; and monitoring tasks require intensive use of hardware sensors. In other words, they can reduce battery lifetime and should be carefully managed.

Therefore, we propose SENSOROID, a generic sensor reading framework that retrieves data from the actual sensor drivers and provides a generic outbound interface. This Android framework for mobile sensing aims to offer IoT app developers a set of attractive facilities and functions to quickly and easily design their own mobile sensing services. Up until now, it has been a challenging task for software developers (especially scientists and experimenters) to implement specialized sensor applications. In our current implementation, available for Android, most of the typical built-in Android sensors are already supported.

The implementation of the sensing framework is presented in detail below. The description covers the framework's architectural design and the framework interfaces that offer sensor access and data gathering for further elaboration by developers. We also designed the data flow model for a better understanding of the classes and the communication among them. This sensing framework will easily accommodate improvements and other extensions in the future, accomplishing our goal of keeping abstraction at all of its levels.

1.3 Paper Roadmap

The text below is structured as follows: in Chapter 2, the theoretical background is presented, along with explanations of basic concepts related to our research area; in Chapter 3, the basic characteristics and architecture of Android are presented, supporting our choice of mobile operating system platform; in Chapter 4, we cover the basic characteristics of our sensing framework, while in Chapter 5 we outline some conclusions.
2. BACKGROUND

2.1 Overview of Internet of Things

2.1.1 IoT definition

When it comes to defining the Internet of Things as a new concept in technology, there are many different opinions, but the basic idea is this: the Internet of Things links objects of the real world with the virtual world, enabling connection between "things" anytime, anyplace, with anything and anyone, ideally using any network and any service. In its most technical sense, it refers to an infrastructure in which billions of sensors embedded in common, everyday devices – "things" as such, or things linked to other objects or individuals – are designed to record, process, store and transfer data and, as they are associated with unique identifiers, interact with other devices or systems using networking capabilities. Through their connections, objects obtain intelligence by cooperating and sharing information about themselves to reach common goals, making for a smarter world: a world where the real, the digital and the virtual converge to create smart environments that make energy, transport, cities and many other areas more efficient [2].

2.1.2 Evolution of the Internet of Things

The number of Internet-connected devices surpassed the number of human beings on the planet in 2011, and by 2020, Internet-connected devices are expected to number between 26 billion and 50 billion. For every Internet-connected PC or handset there will be 5-10 other types of devices sold with native Internet connectivity. So today's well-known Internet of PCs will soon move toward the Internet of Things.

Figure 1: Internet connected-devices and the future evolution (Source: Cisco 2011)
The IoT idea is not new. The concept of the "Internet of Things" came to light in 2005, when the International Telecommunication Union published the first report on the subject. However, IoT's roots can be traced back to the Massachusetts Institute of Technology (MIT), from work at the Auto-ID Center. Founded in 1999, this group worked in the field of networked radio frequency identification (RFID) and emerging sensing technologies. The labs consisted of seven research universities located across four continents, chosen by the Auto-ID Center to design the architecture for the IoT. The idea only recently became relevant to the practical world, mainly because of the progress made in hardware development over the last decade. The decline in size, cost and energy consumption (hardware dimensions that are closely linked to each other) now allows the manufacturing of extremely small and inexpensive low-end computers [3].

The IoT application space is very diverse, and IoT applications serve different users under three key categories: consumers, communities and enterprises. There exist numerous opportunities for IoT applications to leverage the changing dynamics of societal trends (health and wellness, transport and mobility, security and safety, energy and environment, communication and e-society) and market trends (consumer electronics, automotive electronics, medical applications, communication, etc.) [4].

Due to a lack of standardization and interoperability, the Internet of Things is sometimes seen as an "Intranet of Things" in which every manufacturer has defined its own set of interfaces and data formats. Data is then hosted in walled environments, which effectively prevents users from transferring (or even combining) their data from one device to another. Beyond some well-known embryonic applications (Arduino, Nabaztag, Pachube, Touchatag, etc.), today objects can only exchange information within "intranets of things". These objects cannot yet take part in a true Internet of Things, which by definition should be open, uncertain and complex. One of the main challenges of the Internet of Things is therefore to transform connected objects into real actors of the Internet. Meanwhile, smartphones and tablets have become the natural gateways to the Internet for data collected through many IoT devices. As a result, manufacturers have progressively developed platforms that aim to host the data collected through such different devices, in order to centralize and simplify their management.

Many large technology companies are involved in governing the IoT ecosystem through various initiatives and efforts around standardization. There are over 14 industry bodies working on developing IoT standards; some of the leading ones include Thread (led by Google), the AllSeen Alliance (led by Qualcomm) and the Industrial Internet Consortium (IIC, led by GE) [4].

Over the next 10 to 15 years, the Internet of Things is likely to develop fast and shape a newer "information society" and "knowledge economy", but the direction and pace of these developments are difficult to forecast. The maturity of the IoT is a long way ahead and will be characterized by rapid innovation, disruption and continuous evolution.
2.1.3 Use case scenarios and the future

The Internet of Things has very wide applicability in many areas. Today, several IoT scenarios are very end-user oriented, and we will analyze three specific IoT developments (wearable computing, the quantified self and domotics) which are directly interfaced to the user and correspond to devices and services that are actually in use.

• Wearable Computing

Wearable computing refers to everyday objects and clothes, such as watches and glasses, in which sensors are included to extend their functionality. Wearable things are likely to be adopted quickly, as they extend the usefulness of everyday objects that are familiar to the individual – all the more so as they can hardly be differentiated from their unconnected look-alikes. They may embed cameras, microphones and sensors that can record and transfer data to the device manufacturer. Furthermore, the availability of an API for wearable devices (e.g. Android Wear) also supports the creation of applications by third parties, who can thus get access to the data collected by those things [5].

• Quantified Self

Quantified Self things are designed to be carried regularly by individuals who want to record information about their own habits and lifestyles. For example, an individual may want to wear a sleep tracker every night to obtain an extensive view of sleep patterns. Other devices focus on tracking movement, such as activity counters which continuously measure and report quantitative indicators related to the individual's physical activity, like calories burned or distance walked, among others. Some objects further measure weight, pulse and other health indicators. By observing trends and changes in behavior over time, the collected data can be analyzed to infer qualitative health-related information, including assessments of the quality and effects of the physical activity based on predefined thresholds and, to a certain extent, the likely presence of disease symptoms.

Quantified Self sensors often have to be worn in specific conditions to extract relevant information. For example, an accelerometer placed on the belt of a data subject, with the appropriate algorithms, could measure the movements of the abdomen (raw data), extract information about the breathing rhythm (aggregated data and extracted information) and display the stress level of the data subject (displayable data). On some devices, only this latter information is reported to the user, but the device manufacturer or the service provider may have access to much more data that can be analyzed at a later stage.

The Quantified Self is challenging with regard to the types of data collected, which are health-related and hence potentially sensitive, as well as to the extensive collection of such data.
Since it focuses on motivating users to remain healthy, it has many connections with the e-health ecosystem. Yet recent investigations have challenged the real accuracy of the measurements and of the inferences made from them [5].

• Home automation ("domotics")

Today, IoT devices can also be placed in offices or homes: "connected" light bulbs, thermostats, smoke alarms, weather stations, washing machines, or ovens that can be controlled remotely over the Internet. For instance, things containing motion sensors can detect and record when a user is at home and what his or her patterns of movement are, and perhaps trigger specific pre-identified actions (e.g. switching on a light or altering the room temperature). Most home automation devices are constantly connected and may transmit data back to the manufacturer [5].

From its research, Bosch anticipates that the majority of IoT-connected devices in 2022 will be concentrated mainly in four industries: intelligent buildings, automotive, healthcare, and utilities. The IoT has a huge market potential, but also many challenges for the future. Some of those that need to be addressed include how to exchange data between devices in a secure way, how to store and process huge amounts of data, and how to protect privacy. This requires enabling database technologies, because a large amount of big data will be produced; this is, of course, related to the scale of cloud computing. The formula for the success of the IoT includes the standardization of communication between IoT middleware solutions and the IoT market, in order to create large and successful ecosystems.

2.2 IoT as an internet of sensors

While we have discussed the Internet of Things' evolving challenges around wireless and cloud technology, data storage and privacy, the IoT would not be possible without sensors. They are necessary to turn billions of objects into data-generating "things" that can report on their status and, in some cases, interact with their environment. In this context, we can say that sensors play a key role in the technological advances in this field.
• Sensors, a key role in IoT

Sensors are devices capable of detecting and measuring changes in position, temperature, light, etc. and turning them into electrical variables. Sensors are a bridge between the physical world and the Internet. They are literally the equivalent of the human senses: sight, hearing, touch, smell and taste. We need them to get physical data from "things", and once that data is turned into an electrical equivalent, it is easy to feed it into a computer for manipulation, analysis and display. In this way we can process data to create smarter solutions and improve the quality of life.

Sensors are used almost everywhere in electronic products nowadays, and we can find them in a wide variety of applications, such as smart mobile devices, automotive systems, industrial control, healthcare, oil exploration and climate monitoring. There are many types of sensors: chemical, magnetic, mechanical, position, pressure, temperature, CCD and CMOS image sensors, motion sensing, RFID, etc. One form of sensor technology is radio frequency identification (RFID), a method of identifying distinct items using radio waves. RFID is based on tags that contain microscopic chips used to store information about the item to which they are attached.

Each year, hundreds of millions of sensors are manufactured. The application of nanotechnology to sensors should allow improvements in functionality leading to much decreased size, enabling the integration of 'nanosensors' into many other devices [6]. Advances in technologies such as WSNs (wireless sensor networks) and sensor fusion are leading to a tremendous expansion in the delivery of context-aware services customized for any given situation. To respond to the Internet of Things vision of having tiny wireless sensors all around us, sensor development technology should also solve the problem of energy consumption by making sensors self-sufficient rather than the power-hungry devices of today.

Figure 2: How a sensor works (from left to right) (Source: Carré & Strauss)
Figure 3: Range of Sensors in IoT (Source: Reproduced from "Practical Electronics for Inventors", Paul Scherz and Simon Monk)
2.3 Sensing with mobile phones

Mobile phone sensing is an emerging area of interest for researchers, as smartphones are becoming not only the key computing and communication device but also carriers of a rich set of embedded sensors which collectively enable new applications, being the natural gateways to the Internet for data collected through many IoT devices.

2.3.1 Smartphones as a sensing platform

Phone manufacturers never intended their devices to act as general-purpose sensing devices; sensors were only considered tools to facilitate interaction with the phone. However, the mobile industry has started to change direction. In the near future we expect the release of new hardware platforms that facilitate background sensing, and new OS frameworks that incorporate general-purpose sensing middleware. The mobile phone is well on its way to becoming a personal sensing platform in addition to a communication device. Sensors have become more prevalent in mobile devices in recent years, making the mobile phone a sensor gateway for the individual.

Phone sensing is about sensing human activities. Today's top-end mobile phones come with a number of embedded specialized sensors, including an ambient light sensor, accelerometer, digital compass, gyroscope, GPS and proximity sensor, plus general-purpose sensors like the microphone and camera. Sensing-enabled smartphones provide the opportunity to track dynamic information in social networks, green applications, global environmental monitoring, personal and community health care, sensor-augmented gaming, virtual reality and smart transportation systems. More and more organizations and people are discovering how mobile phones can be used for social impact, including how to use mobile technology for environmental protection and sensing, and how to leverage real-time information to make our movements and actions more environmentally friendly. So what better tool than the mobile phone to really launch the Internet of Things?

Mobile devices are already equipped with a wide range of sensors, as we mentioned, and they are also by definition connected to a network. This is one of the key drivers turning the mobile phone into a sensing platform. Most of the smartphones on the market are open and programmable by third-party developers, and offer software development kits (SDKs), APIs and software tools. Therefore, it is easy to access sensor data and to leverage existing software to develop new sensing applications for the general purposes of the IoT. These applications rely on advanced sensor information processes, which mainly involve raw data acquisition, feature extraction, data interpretation and transmission. The acquired sensor data is typically sent over a wireless communication channel (e.g. via Wi-Fi or a cellular network) after locally performing a set of stages to select relevant features, filter redundant information and control data transmission behavior through the deployment and enforcement of low-level decision policies. At each stage, different algorithmic solutions have been envisaged to perform learning and classification tasks, and their main requirements depend on the application type and its impact on the CPU and battery. An architectural model for discussing phone sensing consists of three main components: sense, learn, and share [7].
• Sense: Individual mobile phones collect raw sensor data from the sensors embedded in the phone.

• Learn: The raw sensor data from phones is worthless without interpretation. Information is extracted from the collected data with a variety of data mining and statistical tools. These operations can run directly on the phone, in the mobile cloud, or with some partitioning between the phone and the cloud. As such, the mobile phone is a powerful unit that can continuously monitor a user's ambient context in real time.

• Share: Mobile phones are not limited to simply collecting sensor data. A number of phone sensing systems connect with existing web applications to either enrich existing applications or make the data more widely accessible. For example, a personal sensing application will only inform the user, whereas a group or community sensing application may share an aggregate version of the information with the broader population and obfuscate the identity of the users. Other considerations are how best to visualize sensor data for consumption by individuals, groups and communities. Data sharing helps to personalize sensing systems based on the individual user and groups of people with similar behavior [7]. (A minimal code sketch of this three-stage model is given at the end of this subsection.)

Cell phones have become an indispensable tool not only for today's highly mobile workforce, but also for people in general to transfer and exchange diverse mobile data. Thus, there are open challenges regarding mobile sensing systems, including:

• the sensing of people and their environment, i.e. context awareness;
• the energy-efficient use of mobile device resources;
• the meaning and interpretation of mobile sensor data;
• interactions with users, largely to provide feedback and information to them.

Once these technical barriers are overcome, this new field will advance rapidly, becoming a disruptive technology for a wide range of domains.

If the Internet of Things were divided architecturally into layers, we would so far have talked about the sensing layer as realized through smartphones. Data provided by the sensing layer is processed in the management layer. This layer is integral to the Internet of Things architecture and industry chain, integrating management, control, and operations on terminals and assets, including mobile assets. The management platform comprises the following software sets: integrated frameworks, Internet of Things middleware, industry suites, and industry application solutions [8].
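To make the sense-learn-share model above concrete, here is a minimal Java sketch of the three stages. The interface names Sense, Learn and Share are illustrative only; they belong neither to the Android SDK nor to SENSOROID.

import java.util.List;

// Stage 1: collect raw samples from an embedded sensor.
interface Sense {
    List<float[]> collectRawSamples();
}

// Stage 2: interpret the raw data, e.g. classify it as a user activity;
// this can run on the phone, in the cloud, or partitioned between both.
interface Learn {
    String classify(List<float[]> rawSamples);
}

// Stage 3: publish an aggregate, identity-obfuscated view of the result.
interface Share {
    void publish(String inference);
}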
2.3.2 What is a sensing framework?

Internet of Things frameworks can help support the interaction between "things" and allow for more complex structures, such as distributed computing and the development of distributed applications. Currently, Internet of Things frameworks seem to focus on real-time data logging solutions, offering some basis to work with many "things" and have them interact. Software development then leads to the specific cooperation of software with the hardware used in the Internet of Things.

When we talk about interaction between "things", we mean sensing the physical world. There is no standardized method of obtaining data from sensors or of distributing this data to the applications that use it. A sensing framework offers a standardized, easy-to-use and efficient interface to sensor software developers. It lets them write sensor modules focused on producing useful interpretations of the sensor data, without considering how the produced data will be retrieved from the hardware (a hypothetical sketch of such an interface is given at the end of this subsection).

• Mobile Sensing Framework

For mobile development, interacting with the physical world is still a challenge, due to the complexities of sensor data acquisition, context modeling, and data management. Contextual information extracted from the user's environment can be used to enable an app to adapt its runtime behavior and capabilities to better fit a user's changing situation and requirements. Also, several applications share the same needs with respect to the collected sensory data. Mobile sensing frameworks are flexible platforms that ease the development of mobile sensing applications through the definition of a common set of facilities that mask all low-level technical details of reading and processing raw sensor data. They are used by many independent developers to create software that must work together, so the main goals in implementing them are that sensors be easy to write and easy to use, and that they provide a consistent interface for sensors across processors and platforms. At the same time, frameworks must be flexible enough for developers to exploit all the properties of the physical sensors and external data sources that make up the sensor modules. Since there is a need for unified sensing platforms, there is related work on mobile sensing frameworks, such as the Funf Open Sensing Framework, the EmotionSense framework, MSF and Purple Robot. These projects all collect contextual data using phone sensors and provide several data storage options, including the file system, remote servers and cloud services such as Dropbox [9].

Implementing frameworks is useful for the IoT because, as we said, new features make mobile devices powerful and complete sensing platforms that continuously watch and monitor the behavior of users who move and act in the physical world carrying their mobile devices with them. Moreover, it is possible to process large sets of locally collected raw data on the mobile device and to distill meaningful views of the activity the user is currently engaged in, such as running, cycling, talking or sitting. In brief, mobile devices can be brought into the IoT with the help of frameworks. Many mobile applications can exploit frameworks to make use of these brand-new mobile sensing capacities, spanning different areas, from healthcare to homecare, from safety to smart grids and environmental monitoring, and many more.
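As an illustration of such a standardized interface, the sketch below shows the kind of uniform contract a sensing framework might expose. SensorModule and Callback are hypothetical names invented here, not SENSOROID's actual API.

// Hypothetical sketch of a framework-level sensor contract: a module author
// implements the interpretation logic, while the framework handles retrieval.
public interface SensorModule<T> {

    // A stable name used for discovery, e.g. "accelerometer".
    String name();

    // Begin delivering interpreted readings to the given callback.
    void start(Callback<T> callback);

    // Stop sampling and release the underlying hardware.
    void stop();

    interface Callback<T> {
        void onReading(T value, long timestampNanos);
    }
}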
2.3.3 IoT applications using mobile phone based sensing

Phones can now be programmed to support new disruptive sensing applications, such as sharing the user's real-time activity with friends on social networks such as Facebook, keeping track of a person's carbon footprint, or monitoring a user's well-being. Second, smartphones are open and programmable. In addition to sensing, phones come with computing and communication resources that offer a low barrier of entry for third-party programmers (e.g. undergraduates with little phone programming experience are developing and shipping applications). Third, and importantly, each phone vendor now offers an app store, allowing developers to deliver new applications to large populations of users across the globe, which is transforming the deployment of new applications and allowing the collection and analysis of data far beyond the scale of what was previously possible. Fourth, the mobile computing cloud enables developers to offload mobile services to back-end servers, providing unprecedented scale and additional resources for computing on collections of large-scale sensor data and supporting advanced features such as persuasive user feedback based on the analysis of big sensor data. These key factors create high potential for the deployment of IoT applications.

Mobile sensing applications can be divided into three big categories:

• Individual activity sensing: fitness applications, behavioral suggestions.
• Group activity sensing: groups sensing common activities to help achieve group goals, e.g. neighborhood safety, collective recycling efforts.
• Community sensing: large-scale sensing, where a large number of people have the same application installed, e.g. tracking the spread of disease across a city, or congestion in the city [10].

Next, let us mention the most important domains and examples.

• Physical activity: These applications use sensors such as the accelerometer, gyroscope and compass for activities such as walking or running. Examples: health and calorie tracking, presence sharing.

• Transportation mode: Applications related to transportation use sensors such as the accelerometer, GPS and Wi-Fi for determining location. Mobile sensing systems such as MIT's VTrack are being used to provide traffic information on a large scale for improving commute planning.

• Healthcare: The UbiFit Garden, a joint project between Intel and the University of Washington, captures levels of physical activity and relates this information to personal health goals when presenting feedback to the user.

• Environment: Conventional ways of measuring and reporting environmental pollution rely on aggregate statistics that apply to a community or an entire city. The University of California, Los Angeles (UCLA) PEIR project uses sensors in phones to build a system that enables personalized environmental impact reports, which track how the actions of individuals affect both their exposure and their contribution to problems such as carbon emissions [10].
3. ANDROID AS A MOBILE OPERATING SYSTEM

3.1 Android

3.1.1 Background and history

Android is a Linux-based mobile operating system, initially developed by Android Inc., which Google acquired in 2005. Google, along with other members of the Open Handset Alliance (OHA), collaborated on Android's design, development and distribution. Currently, the Android Open Source Project (AOSP) governs the Android maintenance and development cycle [11].

The Android operating system is based on a modified Linux 2.6 kernel [12]. Compared to a standard Linux 2.6 environment, however, several drivers and libraries have been either modified or newly developed to allow Android to run as efficiently and effectively as possible on mobile devices (such as smartphones or Internet tablets). Some of these libraries have their roots in open source projects. Due to licensing issues, the Android community decided to implement its own C library (Bionic) and to develop an Android-specific Java runtime engine (the Dalvik Virtual Machine, DVM). With Android, the focus has always been on optimizing the infrastructure for the limited resources available on mobile devices. To complement the operating environment, an Android-specific application framework was designed and implemented. Android can therefore best be described as a complete solution stack, incorporating the OS, middleware components, and applications.

3.1.2 Architecture

Figure 4 outlines the current (layered) Android architecture. The modified Linux kernel operates as the HAL and provides device driver, memory management, process management and networking functionality. The library layer is interfaced through Java (which deviates from the traditional Linux design); it is in this layer that the Android-specific libraries (Bionic) are located. The surface manager handles the user interface (UI) windows. The Android runtime layer holds the Dalvik Virtual Machine (DVM) and the core libraries (such as Java or IO); most of the functionality available in Android is provided via the core libraries. The application framework holds the API interface. In this layer, the activity manager governs the application life cycle, the content providers enable applications to access data from other applications or to share their own data, the resource manager provides access to non-code resources (such as graphics), and the notification manager enables applications to display custom alerts. On top of the application framework sit the built-in and user applications, respectively. It has to be pointed out that a user application can replace a built-in application, and that each Android application runs in its own process space, within its own DVM instance.
Figure 4: Android Architecture (Source: blog.ameykelkar.com)

3.1.3 Android SDK

The Android Software Development Kit (SDK) is a set of development tools used to develop applications for the Android platform. The Android SDK has a modular structure, which means that the major components of the SDK are collected into separate packages. This makes it easy to install only the components you need for your particular use case. The packages you install are determined by the version of the OS you are targeting, whether you use third-party services (like Google Maps or Analytics), and whether you plan to support specific hardware (like a particular chipset or a dual screen).

The modular structure has two important benefits. The first is that disk storage is not wasted on downloading unnecessary components. This matters because each platform requires at least 100 MB of space, and this can grow rapidly when optional packages are included. The other advantage is that managing dependencies within a project is streamlined, because it is possible to control exactly which software you are working with and to install only the components you require [13].

It is important to understand the various components that are available. They are organized into the following categories:

• Required libraries
• Debugger
• An emulator
• Relevant documentation for the Android application program interfaces (APIs)
• Sample source code
• Tutorials for the Android OS

As the SDK is free and easy to install and use across multiple platforms, it is attractive for application developers to implement their own ideas, leading to a very rich Android application market [14].

3.1.4 Application features

As mentioned before, Android applications are developed in the Java programming language. The Android SDK compiles the application source code, along with any data and resource files, into an APK: an Android package, which is an archive file with an .apk suffix. These files are what Android devices use to install applications.

In most cases, every Android application runs in its own Linux process, and each process has its own virtual machine, so that applications do not interfere with each other. This process is created for the application when some of its code needs to be run, and it remains running until it is no longer needed and the system needs to reclaim its memory for use by other applications. An unusual and fundamental feature of Android is that an application process's lifetime is not directly controlled by the application itself. Instead, it is determined by the system, through a combination of the parts of the application that the system knows are running, how important those parts are to the user, and how much overall memory is available in the system. To determine which processes should be killed when memory is low, Android places each process into an "importance hierarchy" based on the components running in it and the state of those components [15].

Android applications are mainly made of essential building blocks called application components. Each component defines a way to access the application. Android defines four types of components (a code sketch of the last one follows the list):

1. Activities: Each activity corresponds to a screen of the application and implements the methods related to the layout.
2. Services: A service works in a similar way to an activity, but without a user interface. It is used to run long-running background operations.
3. Content providers: This component is used to access, or even modify, data from outside the application (e.g. to access and/or modify the contact information stored on the device).
4. Broadcast receivers: A component that is executed when a system-wide broadcast message is sent from any point in the system. It can be used to wake up the application on a specific event (e.g. a battery state change or time change) [16].
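As referenced above, the sketch below shows the fourth component type in isolation: a broadcast receiver that wakes the application when the system announces a low battery. The class name BatteryLowReceiver is our own invention; ACTION_BATTERY_LOW is a standard Android intent action, and the receiver must also be declared in the application's AndroidManifest.xml.

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

// A receiver executed when the system broadcasts a low-battery event,
// even if no activity of the application is currently on screen.
public class BatteryLowReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (Intent.ACTION_BATTERY_LOW.equals(intent.getAction())) {
            // React to the event, e.g. throttle sensor sampling rates.
        }
    }
}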
3.2 Android and the Internet of Things

Android will significantly help original equipment manufacturers (OEMs) in rolling out IoT devices. Intelligent IoT devices that need to do more than report data (such as a personal health monitor the size of a small phone that, say, the elderly may carry to collect and analyze data from various body sensors), as well as powerful IoT devices such as home appliances, watches, and car dashboards, will benefit greatly from Android. And of course, one cannot avoid mentioning Google Glass when speaking of the IoT. The key reasons why IoT devices will use Android:

• Android is a stable and free OS that has already been validated in the smartphone market as an OS capable of running on embedded systems.

• As of early 2013, the Google Play marketplace lists over 800,000 applications, which means there is a vibrant developer ecosystem building innovative solutions. If an IoT device uses Android as its base OS, it can automatically leverage these applications, and you can rest assured that developers will start building a slew of innovative apps on that platform.

• You can customize Android as much as you want – it is possible to add your own features right into the OS to make it just right for your environment. There simply isn't another embedded OS that combines this flexibility with so many third-party apps and such widespread adoption.

• Android already supports key communication protocols like NFC, Bluetooth Smart (low energy profile), Wi-Fi Direct and others, which are key for IoT devices.

• Android already supports a wide variety of form factors, and the APIs are tuned to give developers control over how to write apps for a wide variety of sizes (they factor in both screen size and pixel density).

• Android apps are written in Java. Java resources are probably much easier to find, and Java code is also easier to maintain from a cost perspective.

However, being able to adapt Android to specific device needs is a big challenge for many OEMs. The Android SDK has excellent documentation, but if a custom ROM or custom image of Android needs to be created by recompiling the source, there is very little documentation on how to modify Android. Bottom line: while the promises of Android are many, it is also true that adapting Android to a device is very complicated [14].

3.3 Choosing Android as the Implementation Platform

The choice of operating system is critical when designing a framework that will be deployed on multiple device types. Common mobile platforms use Blackberry, Windows Phone, Apple iOS, or Google Android as the embedded OS. According to a report from Strategy Analytics, Google's Android continues to dominate the smartphone operating system market. It is also important to mention that Android has sophisticated application marketplaces, which provide easy access to both software distribution and maintenance channels. We chose Android as the target platform for the SENSOROID sensing framework
because it is open source, has extensive support for background processes, and includes several built-in constructs for inter-process communication (IPC) between Android applications.

Figure 5: Global smartphone operating system market

Additionally, Android supports multiple communication APIs that facilitate connecting to a wide variety of external sensors, as was described in the previous subsection [16].
4. ANDROID SENSING FRAMEWORK

4.1 Framework Description

4.1.1 Core functionality - Overview

The objective of this work is to implement an easy-to-use sensing framework that developers and researchers can use to provide continuous sensing in Android applications. The model we are following provides sensor data to anyone who requires access to it, on demand [17]. The sensing-as-a-service model does not collect sensor data from all the available sensors at all times, but keeps track of the individual sensors, their accessibility and their capabilities; sensor data is collected only when a developer makes a request [18].

Figure 6: SENSOROID framework structure
Thus, the framework architecture should be modular and easy to use at two levels: application developers should be able to quickly create new applications based on raw data and/or already computed high-level inferences, freeing them from understanding the specifics of the underlying communication with the built-in sensors on Android devices; library developers should be able to easily plug in new components, such as support for new sensors and activity classifiers [9]. In particular, SENSOROID consists of multiple system services and an Android sensor event listener that collects sensor values using event handlers and sends them using custom listeners. The goal is a generic framework architecture implementation.

4.1.2 Implementation environment

4.1.2.1 Eclipse IDE

Most people are aware of Eclipse as an integrated development environment (IDE) for Java. Even though Android Studio is now the official IDE for Android, we have chosen Eclipse as the implementation environment of the SENSOROID framework. Android offers a custom plugin for the Eclipse IDE, called Android Development Tools (ADT). This plugin provides a powerful, integrated environment in which to develop Android apps. It extends the capabilities of Eclipse to let you quickly set up new Android projects, build an application's UI, debug your application, and export signed (or unsigned) application packages (APKs) for distribution. In our project we are using the latest ADT version, 23.0.4.

4.1.2.2 Application Program Interface and Test specifications

SENSOROID was developed on Mac OS Yosemite (version 10.10.2) and Windows 8 using the Android Framework API level 21, targeting device platform versions 8 and newer. Two smartphones were used to test the framework, running the KitKat (4.4.2) and Jelly Bean (4.2.2) Android mobile operating systems. The Linux kernel versions varied between the devices tested, ranging from 3.4.0 to 3.4.5.

4.2 SENSOROID Framework Fundamentals

4.2.1 Framework components

4.2.1.1 Activities

An Activity is a type of Android component that is in charge of user interface (UI) management and user interaction event configuration. Almost all activities require interaction with the user, and for that reason the activity takes care of creating the window and laying out the UI components [19]. Typically, one activity in an application is specified as the "main" activity, which is presented to the user when launching the application for the first time. Each activity can then start another activity in order to perform different actions. Each time a new activity starts, the previous activity is stopped, but the system preserves it in a stack (the "back stack"). When a new activity starts, it is pushed onto the back stack and takes user focus. The back stack abides by the basic "last in, first out" stack mechanism: when the user is done with the current activity and presses the Back button, it is popped from the stack (and destroyed) and the previous activity resumes. When an activity is stopped because a new activity starts, it is notified of this change in state through the activity's lifecycle callback methods.
The activity life cycle is managed by the Activity Manager, a service that runs inside the Android Framework layer of the stack. It is responsible for creating, destroying and managing activities. For example, when the user starts an application for the first time, the Activity Manager will create its activity and put it onto the screen [20]. To create a new activity, you simply derive a new class from the android.app.Activity class. Seven life cycle callbacks can be implemented (see Figure 7): onCreate(), onStart(), onRestart(), onResume(), onPause(), onStop() and onDestroy().

Figure 7: Android Activity Lifecycle (Source: Android Developers)

When an activity first starts, the methods onCreate(), onStart() and onResume() are executed. onCreate() is called when the activity gets created; all activities implement this method in order to initialize the activity and its UI. If another activity comes into the foreground, the running activity goes to onPause(). Paused activities still have high priority in terms of getting memory and other resources, but if there is not enough memory for a foreground activity, an activity stored in the background will be destroyed. This means that if there is important information, it is better to save it while running the onPause() method. Conversely, if some information needs to be reloaded or reconfigured, it should be done while running the onResume() method. At this state, the user can interact with the application.
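As a minimal illustration of these callbacks (a sketch only; the class name is hypothetical and not part of SENSOROID), an activity that saves and restores its state might look like this:

    import android.app.Activity;
    import android.os.Bundle;

    // Hypothetical activity illustrating the lifecycle callbacks described above.
    public class ExampleActivity extends Activity {

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState); // always call through to the superclass
            // initialize the activity and its UI, e.g. setContentView(R.layout.main);
        }

        @Override
        protected void onPause() {
            super.onPause();
            // another activity is coming into the foreground: persist important data here
        }

        @Override
        protected void onResume() {
            super.onResume();
            // reload or reconfigure anything the user needs while interacting with the app
        }
    }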
4.2.1.2 Services

A Service is an application component that can perform long-running operations in the background and does not provide a user interface. Another application component can start a service, and it will continue to run in the background even if the user switches to another application [15]. Services have a much simpler life cycle than activities (see Figure 8): you either start a service or stop it. Also, the service life cycle is more or less controlled by the developer, and not so much by the system [20].

Figure 8: Android Service lifecycle (Source: Android Developers)

A service can essentially have two states: started and bound. A service is "bound" when an application component binds to it by calling bindService(). A service is "started" when an application component (such as an activity) starts it by calling startService(). Once started, a service can run in the background indefinitely, even if the component that started it is destroyed. Usually, a started service performs a single operation and does not return a result to the caller. When the operation is done, the service should stop itself [15].

To allow components to start the service, some callback methods have to be implemented. onCreate() is called when the service is first created. onStartCommand() is called every time the service is explicitly started through an intent; while the service is running, this method can be called multiple times by the system to queue more work for the service. onDestroy() is called when the service is no longer in use or is being destroyed; the service should clean up any resources it holds, as this is the last call it receives. The service restart strategy can be adjusted through the return value of onStartCommand(). In the SENSOROID framework this method returns the START_NOT_STICKY constant, which tells the OS not to bother recreating the service again.
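A minimal sketch of such a started service (the class name is hypothetical; only the return value reflects SENSOROID's documented choice):

    import android.app.Service;
    import android.content.Intent;
    import android.os.IBinder;

    // Hypothetical started service; SENSOROID's services return START_NOT_STICKY.
    public class ExampleService extends Service {

        @Override
        public int onStartCommand(Intent intent, int flags, int startId) {
            // do the background work triggered by the intent here
            return START_NOT_STICKY; // do not recreate the service if the OS kills it
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null; // this service is started, not bound
        }

        @Override
        public void onDestroy() {
            super.onDestroy();
            // release any resources held by the service
        }
    }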
4.2.2 Intents

Intents are messaging objects that are sent among the building blocks. They trigger an activity to start up; tell a service to start, stop, or bind; or are simply broadcasts. Intents are asynchronous, meaning the code that sends them does not have to wait for them to be completed. Applications use Intents for both inter-application and intra-application communication. Additionally, the operating system sends Intents to applications as event notifications. Some of these event notifications are system-wide events that can only be sent by the operating system; we call these messages system broadcast Intents [21].

An Intent can be explicit or implicit. In an explicit intent, the sender clearly spells out which specific component should be on the receiving end. In an implicit intent, the sender declares a general action to perform, which allows a component from another app to handle it. When you create an implicit intent, the Android system finds the appropriate component to start by comparing the contents of the intent to the intent filters declared in the manifest files of other apps on the device. If the intent matches an intent filter, the system starts that component and delivers it the Intent object. If multiple intent filters are compatible, the system displays a dialog so the user can pick which app to use. An intent filter is an expression in an app's manifest file that specifies the type of intents the component would like to receive. For instance, by declaring an intent filter for an activity, you make it possible for other apps to directly start your activity with a certain kind of intent. Likewise, if you do not declare any intent filters for an activity, it can be started only with an explicit intent [15].

Although intents facilitate communication between components in several ways, there are three fundamental use cases:
• startActivity(): Activities are started by passing an Intent to startActivity().
• startService(): Intents are used to start and bind to Services, by passing an Intent to startService(). The Intent describes the service to start and carries any necessary data.
• sendBroadcast(): Broadcast Receivers receive Intents sent to multiple applications. They are triggered by the receipt of an appropriate Intent and then run in the background to handle the event.

Intents can be sent between three of the four components: Activities, Services and Broadcast Receivers. Intents can be used to start Activities; start, stop and bind Services; and broadcast information to Broadcast Receivers. For a Service or Activity to receive Intents, it must be declared in the manifest.
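For illustration (a sketch; ExampleActivity is the hypothetical class from above, and the URL is a placeholder), an explicit intent names its target directly, while an implicit intent only declares an action:

    import android.content.Intent;
    import android.net.Uri;

    // Inside an Activity:
    void openTargets() {
        // Explicit intent: the receiving component is named directly.
        Intent explicit = new Intent(this, ExampleActivity.class);
        startActivity(explicit);

        // Implicit intent: any app whose intent filter matches ACTION_VIEW may handle it.
        Intent implicit = new Intent(Intent.ACTION_VIEW, Uri.parse("http://example.com"));
        startActivity(implicit);
    }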
4.2.3 Context

Context is an abstract class whose implementation is provided by the Android system; it has access to global information about the application environment. Context helps the current activity interact with local files, databases, class loaders associated with the environment, services (including system-level services), and more. The context can be accessed by invoking getApplicationContext(), getContext(), getBaseContext() or this (in an activity class). Typical uses of context:
• Creating new objects: views, adapters, listeners.
• Accessing standard common resources, such as services like LAYOUT_INFLATER_SERVICE or Shared Preferences: context.getSystemService(), getApplicationContext().getSharedPreferences().
• Accessing components implicitly, e.g. content providers, broadcasts, intents: getApplicationContext().getContentResolver().query(uri, ...).

Finally, both the Activity and Service classes are subclasses of the Context class, so their instances can be used wherever a context is needed. In the case of our framework, we needed to access the application context outside the classes mentioned above. A new class called ApplicationContextProvider was created to support the generic architecture of the framework and to provide the context anywhere outside an Activity or Service. It also has to be declared in the application tag of AndroidManifest.xml; after that, it can be accessed by simply calling ApplicationContextProvider.getContext().

4.2.4 Listeners

Listeners are objects that watch for state changes. The SensorEventListener is an interface that provides the callbacks to alert an app to sensor-related events. To be made aware of these events, an app registers a concrete class that implements SensorEventListener with the SensorManager.

SensorManager is the Android system service that gives an app access to hardware sensors. Like other system services, it allows apps to register and unregister for sensor-related events. Once registered, an app will receive sensor data from the hardware [34]. An application, or in our case a sensing framework, must implement SensorEventListener and provide implementations for both onSensorChanged() and onAccuracyChanged() to receive sensor data, extract data from the SensorEvent depending on the sensor type, and ensure that the application/framework unregisters the listener at the right time.

It is important to remember to unregister sensor listeners whenever they are not in use. Not doing so drains the battery and uses system resources, including the garbage collector. Android does not take care of this by itself when another Activity comes to the foreground or when the screen is turned off; it is in the hands of the app developer to control listeners wisely. If Android kills the application, however, it also unregisters the listeners [22].
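A minimal sketch (the class name is hypothetical) of registering and unregistering such a listener for the light sensor:

    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    // Hypothetical listener; register/unregister mirrors the advice above.
    public class LightListener implements SensorEventListener {
        private final SensorManager sensorManager;

        public LightListener(Context context) {
            sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        }

        public void register() {
            Sensor light = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
            sensorManager.registerListener(this, light, SensorManager.SENSOR_DELAY_NORMAL);
        }

        public void unregister() {
            sensorManager.unregisterListener(this); // avoid draining the battery
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            float lux = event.values[0]; // ambient light level in lux
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // react to accuracy changes if needed
        }
    }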
4.2.5 Android Manifest

This file contains information such as the targeted Android version and the required permissions; the main activity is also declared in the Manifest so that it is shown first when the application is launched. The Manifest must declare all of the Android components (activities, services, content providers or broadcast receivers) that the application uses. Each component is related to the class that implements it and to the conditions under which the component is executed. By default, Android applications are not able to access any part of the system outside their own components. To access any system resource or content, the application must declare the required permissions, so that the user is aware of the application's scope.

4.2.6 Sensors and Location

A fact that contributes to making Android-powered devices really powerful is the several built-in sensors they have. Android devices are commonly manufactured with several sensors that can be used in any application, and the Android operating system is designed to manage all these built-in sensors with an easy-to-use framework for Android application development. As there are many possible sensors that can be implemented on an Android device, they are divided into three main groups:
• Motion sensors: These sensors measure the forces applied to the device, that is, acceleration and rotational forces. The best-known sensors are the accelerometer and the gyroscope.
• Environmental sensors: These sensors measure environmental parameters like temperature, light, humidity, etc. Typical environmental sensors are barometers, thermometers and photometers.
• Position sensors: Position sensors measure the parameters that determine the physical position of the device. The most common ones are the magnetometer and the proximity sensor.
Our framework provides access to 12 Android sensors (see Figure 9), whose values can be retrieved easily by a developer.

Figure 9: SENSOROID implemented sensors

1. Sensor.TYPE_LIGHT
The light sensor is a hardware sensor visible on the front side of the device. It is a simple photodiode, which generates a voltage when light is incident on it. The light sensor measures the ambient light level (illumination) in lux (lx) and has a dynamic range between 1 and 30,000 lux. It is mostly used to adjust screen brightness according to the ambient light.

2. Sensor.TYPE_ACCELEROMETER
The accelerometer is a hardware sensor used to detect a shake, tilt motion, etc. Android reports acceleration force in m/s². When retrieving values from the accelerometer, you will receive X, Y and Z values (including the force of gravity) that correspond to the following axes (see Figure 10).
Figure 10: Accelerometer axes (Source: Android Developers)

Accelerometers are typically used in one of three modes: as an inertial measurement of velocity and position; as a sensor of inclination, tilt, or orientation in two or three dimensions, as referenced from the acceleration of gravity; or as a vibration or impact (shock) sensor.

3. Sensor.TYPE_GRAVITY
The gravity sensor is a hardware or software sensor derived from the 3-axis acceleration sensor. It measures the vector components of gravity when the device is at rest or moving slowly; when a device is at rest, the gravity sensor should measure the same values as the accelerometer. Gravity is reported as a unit vector (scalar = 1). The gravity sensor outputs three Cartesian axis values: when a device is accelerated in the ±X, ±Y, or ±Z direction, the corresponding output increases (+) or decreases (-), while acceleration changes generated by gravity are sensed in the opposite directions.

4. Sensor.TYPE_GYROSCOPE
The gyroscope sensor measures the rate of rotation in rad/s, calculated from the measurement data retrieved from a 3-axis gyroscope. You cannot directly measure angles using a gyroscope; however, the gyroscope values are often integrated over time to calculate an angle. Rotation is positive in the counter-clockwise direction; that is, an observer looking from some positive location on the x, y or z axis at a device positioned at the origin would report positive rotation if the device appeared to be rotating counter-clockwise. This is the standard mathematical definition of positive rotation and is not the same as the definition of roll used by the orientation sensor. Standard gyroscopes provide raw rotational data without any filtering or correction for noise and drift (bias). In practice, gyroscope noise and drift introduce errors that need to be compensated for. You usually determine the drift (bias) and noise by monitoring other sensors, such as the gravity sensor or the accelerometer.
5. Sensor.TYPE_LINEAR_ACCELERATION
The linear acceleration sensor provides a three-dimensional vector representing the acceleration along each device axis, excluding the force of gravity. This sensor is needed to obtain acceleration data without the influence of gravity. The linear acceleration sensor always has an offset, which you need to remove. The simplest way to do this is to build in a calibration step: during calibration you can ask the user to set the device on a table and then read the offsets for all three axes. You can then subtract that offset from the sensor's direct readings to get the actual linear acceleration. The sensor coordinate system is the same as the one used by the acceleration sensor, as are the units of measure (m/s²).

6. Sensor.TYPE_MAGNETIC_FIELD
Magnetic field sensors report the magnetic field in x, y and z (by having three separate sensors, one aligned along each axis). Android reports magnetic fields in microtesla (μT); a typical dynamic range is around 2000 μT. In the absence of a magnetic sensor you can see your position on the map, but not your orientation (the map does not rotate when you do). The magnetic sensor helps you take that first step in the correct direction with the help of the direction pointer.

7. Sensor.TYPE_PRESSURE
This constant refers to a MEMS barometer, which measures air pressure. This sensor is currently available only in a few devices; its primary use is determining altitude in places where the device cannot get a GPS fix, such as locations inside a building. The pressure measurement can be used to forecast short-term changes in the weather and to estimate altitude. The ambient air pressure is returned as one value in hectopascals or millibars (hPa or mbar).

8. Sensor.TYPE_PROXIMITY
The proximity hardware sensor is used to calculate how far away an object is from the device; it is usually used to determine how far away a person's head is from the face of a handset (for example, when a user is making or receiving a phone call). Most proximity sensors return the absolute distance in cm, but some return only near and far values in binary.

9. Sensor.TYPE_ROTATION_VECTOR
The rotation vector represents the orientation of the device as a combination of an angle and an axis, in which the device has rotated through an angle θ around an axis (x, y, or z), and returns 4 values. The three elements of the rotation vector are <x*sin(θ/2), y*sin(θ/2), z*sin(θ/2)>, such that the magnitude of the rotation vector is equal to sin(θ/2) and the direction of the rotation vector is equal to the direction of the axis of rotation. The fourth value is the scalar component (cos(θ/2)), and it is optional. The rotation vector sensor is particularly versatile and can be used for a wide range of motion-related tasks, such as detecting gestures, monitoring angular change, and monitoring relative orientation changes. For example, it is ideal for developing a game, an augmented reality application, a two- or three-dimensional compass, or a camera stabilization app [29].
10. Sensor.TYPE_AMBIENT_TEMPERATURE
The ambient (room) temperature sensor measures the ambient air temperature, returning one value in Celsius (˚C). Not many devices have this sensor available. The raw data acquired from the ambient temperature sensor usually requires no calibration, filtering, or modification. This sensor is meant to replace Sensor.TYPE_TEMPERATURE, which has been deprecated.

11. Orientation Sensor
The orientation sensor derives its data by processing the raw sensor data from the accelerometer and the magnetic field sensor. Because of the heavy processing involved, the accuracy and precision of the orientation sensor are diminished (specifically, this sensor is only reliable when the roll component is 0). Instead of using raw data from the orientation sensor, the getRotationMatrix() method is used in conjunction with the getOrientation() method to compute orientation values [29]. getOrientation() returns three values in radians: the azimuth (rotation about the z-axis), pitch (rotation about the x-axis) and roll (rotation about the y-axis). More specifically, the accelerometer and magnetometer measurements are passed into getRotationMatrix(), which populates a rotation matrix; the generated rotation matrix is then passed into getOrientation() to get yaw, pitch and roll.

12. Location
Android-powered devices have two ways of requesting location updates. The best-known and most accurate one is GPS (Global Positioning System); however, the GPS locating mechanism can only be used outdoors and consumes a lot of energy. The other available location strategy is via cellular networks: Android's Network Location Provider determines user location using cell tower and Wi-Fi signals, providing location information in a way that works indoors and outdoors, responds faster, and uses less battery power. In order to receive location updates from NETWORK_PROVIDER or GPS_PROVIDER, user permission must be requested by declaring the ACCESS_COARSE_LOCATION or ACCESS_FINE_LOCATION permission, respectively, in the Android manifest file. Without these permissions, the application will fail at runtime when requesting location updates [29]. A location can consist of latitude, longitude, timestamp, and other information such as bearing, altitude and velocity [29]. All locations are generated by the LocationManager, which provides access to the system location services and allows telling Android when the application is interested in receiving location updates and when it no longer wants them. The LocationManager can also provide the cached location of the device, the current state of the location system (such as the available location providers), and GPS status information (see Figure 11).
Figure 11: Android location components (Source: [12])

All LocationProviders generate location data differently, but they communicate with an application in the same way and provide similar data in the same manner. The LocationListener interface contains a group of callback methods that are called in reaction to changes in a device's current location or changes in location service state. Objects that implement LocationListener are notified of location updates by a call to their onLocationChanged() method. The specific LocationListener instances that will be notified about a location update are registered with the LocationManager; when the LocationManager has a new location to offer, it calls onLocationChanged() on each listener [22].

The Android operating system can also provide values from sensors that are called software-based or virtual sensors. This means that the data acquired from these sensors is not measured directly by a hardware sensor; instead, the values are computed from several hardware-based sensors. An example is the orientation sensor described above, which combines data from the magnetic field sensor and the accelerometer. Other virtual sensors are the linear acceleration sensor and the gravity sensor, which both compute their values from the hardware accelerometer. A fact that makes it easier to develop sensor-based Android applications is that the Android sensor framework provides very accurate sensor values, well formatted in SI units (International System of Units). Android does not specify a standard sensor configuration for devices, which means that device manufacturers can incorporate any sensor configuration they want into their Android-powered devices. As a result, devices can include a variety of sensors in a wide range of configurations [15].
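To make the virtual-sensor idea concrete, the following sketch (an illustration only, not SENSOROID code) computes azimuth, pitch and roll from the latest accelerometer and magnetometer readings, exactly as described for the orientation sensor above:

    import android.hardware.SensorManager;

    // Fragment, e.g. inside onSensorChanged(): accelValues and magnetValues are
    // assumed to hold the most recent TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD readings.
    float[] rotationMatrix = new float[9];
    float[] orientation = new float[3];
    if (SensorManager.getRotationMatrix(rotationMatrix, null, accelValues, magnetValues)) {
        SensorManager.getOrientation(rotationMatrix, orientation);
        float azimuth = orientation[0]; // rotation about the z-axis, in radians
        float pitch   = orientation[1]; // rotation about the x-axis, in radians
        float roll    = orientation[2]; // rotation about the y-axis, in radians
    }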
4.3 Development Requirements and Constraints

When it comes to designing a framework, there are quite a few documented methodologies. Most of them, though, share the idea that the objective is to identify abstractions with a bottom-up approach: start by examining existing solutions and generalize from them. When adopting a framework development process, it is important to take into account that any framework will usually start as a white-box framework and ideally evolve into a black-box one. It is also important to understand that developing a framework is more difficult than developing an application, because a framework is a piece of software used to create applications: it can be a library, a collection of many libraries, a collection of scripts, etc.

There are two different activities in framework development: core framework design and framework internal increments. The core framework design comprises both abstract and concrete classes in the domain. The concrete classes are intended to be invisible and transparent to the end user (e.g. a basic storage utility), while the abstract classes are either intended to be invisible or to be used through subclassing. Framework internal increments, on the other hand, build additional classes that form a number of class libraries. A number of activities can be identified in the first development phase, namely domain analysis, architectural design, framework design, framework implementation, framework testing and application testing.

With frameworks, developers do not have to start from scratch every time they write a particular application. Frameworks help developers devote their time to meeting software requirements rather than dealing with the more standard low-level details, reducing the overall development time. They also provide a well-designed and thought-out infrastructure, so that when new pieces are created and added, they can be integrated with minimal impact on the other pieces of the framework.

A few design principles we followed while designing our framework were:
1. Implement the high-level requirements through abstract classes and interfaces.
2. Provide utility classes that might be useful for framework users.
3. Consider what should stay internal (a kind of metadata) and not be exposed to framework users (application developers).
4. Name things clearly, but concisely. This is probably one of the hardest tasks for framework developers, especially when implementing abstract or generalized concepts; some names have multiple meanings and may be interpreted differently than you might expect.

Speaking more precisely about SENSOROID, a sensing framework designed to be useful not just to Android specialists, an important constraint was to hide the Android-specific sensor code and context management. We made the following decisions while implementing this framework:
• We chose interfaces rather than abstract classes, because interfaces promote composition over inheritance. Inheritance, from our point of view, introduces a whole extra layer of complexity, as you need to work out the relationships between superclasses and subclasses, while interfaces are usually cleaner for the reason mentioned above.
• Our main objective was to think about what the developer should expect. Basically, this relates to data access from the sensors: we tried to make it flexible and useful for framework users to get sensor data without requiring Android development knowledge.
• Throughout all the development steps we designed for testability. Below we explain a sample application which others can use to understand the framework's functionality.
• Finally, we think that SENSOROID is easy to extend with new features and can grow while maintaining this design structure.

4.4 Framework Architectural Design

The list of Android structures used in the sensing framework implementation was presented earlier in the text. These features are used for packaging, storing or sending sensor data, creating in this way the basis of a common interface: a communication bridge between framework users and sensor drivers. Figure 12 shows the architectural design of the SENSOROID framework at every level, via the Sensor Interface and the Sensor Data Listener. Framework users (developers) only need to implement the specific methods that handle sensor data received from the bottom level. The framework supports multiple calls by providing abstractions that hide sensor-specific Android properties. SENSOROID exposes all 11 built-in Android sensors (for Android 4.4.2 and greater), creating a single communication channel to manage them.

Figure 12: SENSOROID architecture of all levels

4.5 Framework Development Analysis

It is important at this point to describe how the SENSOROID framework was developed, by analyzing step by step all the programming approaches we took to achieve the desired result. There are three main points we have to emphasize while explaining the
framework program structure: the sensor interface, the sensor manager via services, and sensor data acquisition. Our framework is composed of Java files (Android 4.4.2 or greater), and for every package we explain its functionality and how its components cooperate.

4.5.1 Framework Interface

The sensor interface's role is to communicate with the low-level tasks, encapsulating the complexities linked to each sensor. It enables developers to use all the built-in Android sensors through two methods:
• startSensor(SensorType);
• stopSensor(SensorType);

These two methods are defined in the file Sensor.java. As is visible, the sensor interface requires the sensor type to control a specific sensor in the framework, where the sensor type is e.g. Light, Accelerometer, Pressure, etc. Once startSensor(SensorType) is called, a service starts working in the background for each sensor type the developer wants. The service then runs until the stopSensor(SensorType) method is called.

4.5.2 Sensor Interface Implementation

In the file SensorImpl.java we developed the implementation of the sensor interface. For each sensor type, the start and the stop of the responsible service are implemented; these operations require calling startService() and stopService(). As services are an Android application component and a subclass of Context, we had to retrieve the context inside the framework to make it a generic and easy-to-use program for developers, as mentioned before. To realize this we used the ApplicationContextProvider.java utility class, which provides the Context to every Java class in a static way. In general this approach could lead to problems, but in this particular case it is good practice, because we have to pass a context beyond the scope of an Activity, and it also helps to avoid memory leaks. To start the sensor services from an activity we use an explicit intent with startService(). The method returns immediately, and the Android system calls the service's responsible method to begin running. The delivered intent is the only communication node between the application component and the service.
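Based on the description above, the interface and its implementation might look roughly like this (a sketch under stated assumptions: the SensorType enum values and the per-sensor service class names are hypothetical):

    import android.content.Context;
    import android.content.Intent;

    // Sketch of the framework interface described in 4.5.1 (Sensor.java);
    // this is the framework's own type, distinct from android.hardware.Sensor.
    public interface Sensor {
        void startSensor(SensorType sensorType);
        void stopSensor(SensorType sensorType);
    }

    // Sketch of its implementation (SensorImpl.java): one service per sensor type,
    // started and stopped with explicit intents.
    public class SensorImpl implements Sensor {

        @Override
        public void startSensor(SensorType sensorType) {
            Context ctx = ApplicationContextProvider.getContext();
            ctx.startService(new Intent(ctx, serviceFor(sensorType))); // explicit intent
        }

        @Override
        public void stopSensor(SensorType sensorType) {
            Context ctx = ApplicationContextProvider.getContext();
            ctx.stopService(new Intent(ctx, serviceFor(sensorType)));
        }

        // Hypothetical mapping from a sensor type to its background service class.
        private Class<?> serviceFor(SensorType type) {
            switch (type) {
                case Light: return LightSensorService.class;
                default:    throw new IllegalArgumentException("Unsupported: " + type);
            }
        }
    }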
4.5.3 Sensor Manager via Services

The entry point to the Android Sensor API is the SensorManager class, which allows requesting sensor information and registering to receive sensor data. For each call to a service, the Sensor Manager dispatches the commands to the appropriate sensor object, which in turn utilizes a sensor driver to perform specific low-level tasks.

4.5.3.1 Sensor Services

Once the startService(...) method is called, the onStartCommand(Intent intent, int flags, int startId) method creates the SensorManager service running in the background. Only one sensor can be registered per service, so we have a different service file for each sensor type. The service will continue running, no matter how many times it is started, until stopService() or stopSelf() is called.

4.5.3.2 Sensor Manager

SensorManager is the Android system service that gives an app access to hardware sensors. Like other system services, it allows apps to register and unregister for sensor events; once registered, an app will receive sensor data from the hardware. You can access the SensorManager via getSystemService(SENSOR_SERVICE). The Sensor class defines several constants for accessing the different sensors (e.g. Sensor.TYPE_ACCELEROMETER). You can access a sensor via the sensorManager.getDefaultSensor() method, which takes the sensor type as a parameter; the sampling delay, defined as constants on SensorManager, is passed when registering a listener. The sensor returned from getDefaultSensor() may be either a raw sensor or a synthetic sensor that manipulates raw sensor data [22].

Figure 13: Sensor Manager interaction with sensor hardware (Source: Tizen Project)
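Putting 4.5.3.1 and 4.5.3.2 together, such a per-sensor service could be sketched as follows (the class name is hypothetical; only the one-sensor-per-service pattern and the START_NOT_STICKY return value come from the text):

    import android.app.Service;
    import android.content.Intent;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.IBinder;

    // Hypothetical per-sensor service: exactly one registered sensor per service.
    public class LightSensorService extends Service implements SensorEventListener {
        private SensorManager sensorManager;

        @Override
        public int onStartCommand(Intent intent, int flags, int startId) {
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            Sensor light = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT);
            sensorManager.registerListener(this, light, SensorManager.SENSOR_DELAY_NORMAL);
            return START_NOT_STICKY; // as in SENSOROID: do not recreate after a kill
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // forward event.values to the framework's custom data listeners (see 4.5.4)
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }

        @Override
        public void onDestroy() {
            sensorManager.unregisterListener(this); // stop receiving sensor data
            super.onDestroy();
        }

        @Override
        public IBinder onBind(Intent intent) {
            return null; // started service, not bound
        }
    }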
4.5.4 Gathering Sensor Data

Our framework offers the possibility of data acquisition from the sensors; in this part we explain the classes and components involved in this process and their relationships.

4.5.4.1 Sensor Event Listener Interface

The SensorEventListener is an interface that provides the callbacks to alert an app to sensor-related events. To be made aware of these events, an app registers a concrete class that implements SensorEventListener with the SensorManager. The SensorEvent is the data structure that contains the information passed to an app when a hardware sensor has information to report. A SensorEvent object is passed from the sensor system service to the callback methods of the SensorEventListener; the listener then processes the data in the SensorEvent object in an application-specific manner. The onSensorChanged(SensorEvent event) method is called every time a sensor value changes. Like any common listener, the Sensor Event Listener should be registered and unregistered when the SensorManager service starts and stops, respectively.

4.5.4.2 Custom Data Listener Interface

The Custom Data Listener Interface, like the Sensor Interface, provides an access point for application developers to interact with the SENSOROID framework. We said before that we could provide access to sensor data through SensorEventListener, but this is an Android interface, and we need our framework to be usable not just from Activities or Fragments. To achieve this, we developed our own listener interface, which is also called every time the sensor values change. The method developers can use to manage sensor data by implementing the listener is receiveSensorData(), which maps the data for each sensor type in a uniform way (a sketch of a possible shape for this interface is given at the end of this section). The sequence diagram of the listening process through the Custom Listener is shown below.

Diagram 1: Custom Data Listener Interface activity sequence diagram

4.5.4.3 Broadcast Receivers

Besides the Custom Data Listener Interface, the Broadcast Receiver class can also offer the possibility to gather sensor data, through its abstract method onReceive(Context, Intent). In the beginning we started using receivers to get the data values, but it was difficult to keep a high level of abstraction because of the Android components, such as Context or Intents, that receivers need in order to get triggered.
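The exact signature of the custom listener is not shown in the text; a plausible sketch (everything beyond the receiveSensorData() name is an assumption for illustration) would be:

    // Sketch of the framework's custom listener; the method name follows 4.5.4.2,
    // the parameter list is assumed.
    public interface SensorDataListener {
        void receiveSensorData(SensorType sensorType, float[] values);
    }

    // Inside a sensor service's onSensorChanged(), each registered listener could
    // then be notified in a uniform, Android-free way:
    // for (SensorDataListener l : listeners) {
    //     l.receiveSensorData(SensorType.Light, event.values);
    // }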
4.6 Data Flow Modulation

• Framework Class Diagram
In the previous sections of this paper, we described the SENSOROID framework architecture and implementation in detail, explaining every system unit and their communication channels. To clarify the interactions between the classes, the class diagram is presented below. The independence between the classes is clearly visible, achieving the abstraction we need for our framework.

Diagram 2: SENSOROID class diagram
• Start Sensor Sequence Diagram
In this sequence diagram we show how we dealt with context-awareness in the sensor interface, and our solution for hiding the Android-specific sensor code.

Diagram 3: Start Sensor method activity sequence diagram
• Stop Sensor Sequence Diagram
The same pattern was implemented to call the stop sensor method in the framework interface. As the diagram shows, every sensor follows a common way of interacting with the services, resulting in an easy-to-manage structure.

Diagram 4: Stop Sensor method activity sequence diagram
4.7 Test and Extensibility

4.7.1 Testing on a real device versus the simulator

Before testing our framework on an actual device, we tried to simulate motion and retrieve the sensor data using the OpenIntents SensorSimulator (see Figure 14). This simulator also lets you simulate the battery level and the GPS position using a Telnet connection. It currently supports the accelerometer, compass, orientation, temperature, light, proximity, pressure, gravity, linear acceleration, rotation vector and gyroscope sensors, and its behavior can be customized through various settings.

Figure 14: SensorSimulator overview (Source: OpenIntents)

However, the emulator does not represent the specific implementation of the Android platform that is unique to a given device; it does not use the same hardware to determine signal, networking, or location information [23]. So even though this method helps to involve sensors not embedded in most common devices, we chose to focus on functional testing on an actual device. The MainActivity class is the key execution component of the SENSOROID test application; it represents a single screen with a simple user interface and helps verify the detailed design of the framework on a real Android mobile device.
When the user selects the SENSOROID app icon from the Home screen (see Figure 15), the system calls the onCreate() method of the Activity that is declared to be the "launcher" (or "main") activity; this is the activity that serves as the main entry point to the app's user interface.

Figure 15: Real device testing UI

The buttons Start Light Sensor and Stop Light Sensor call the functions startSensor(SensorType.Light) and stopSensor(SensorType.Light), respectively. After that, the screen toasts the successive event values retrieved from the device's sensor. To show how an Activity can cooperate with the framework, another class diagram is given below, this time including the MainActivity class. As explained before, in the Activity class the developer can implement the Sensor Interface and the Custom Data Listener, and build further extensions that work with the Android sensors.

Figure 16: Main Activity Class Diagram
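A condensed sketch of what such a test activity could look like (the UI wiring is an assumption; only startSensor()/stopSensor() and receiveSensorData() come from the text, and SensorDataListener is the hypothetical interface sketched in 4.5.4):

    import android.app.Activity;
    import android.os.Bundle;
    import android.widget.Toast;

    // Hypothetical test activity wiring the framework interface to the two buttons.
    public class MainActivity extends Activity implements SensorDataListener {
        private final Sensor sensors = new SensorImpl(); // framework entry point

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // "Start Light Sensor" button -> sensors.startSensor(SensorType.Light);
            // "Stop Light Sensor" button  -> sensors.stopSensor(SensorType.Light);
        }

        @Override
        public void receiveSensorData(SensorType sensorType, float[] values) {
            // Toast each value delivered through the framework's custom listener.
            Toast.makeText(this, sensorType + ": " + values[0], Toast.LENGTH_SHORT).show();
        }
    }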
To check how many sensors a specific testing device actually has, the application named Phone Tester comes in very handy (see Figure 17).

Figure 17: Phone Tester Application UI