The goal of the Sensorpedia Mobile Platform was to develop a software application which can be run on a mobile computing device and used to transmit distributed data readings from citizens in near-real time. This application permits citizens to rapidly report event information, such as disaster occurrences, security anomalies, and accounts of emergencies.
1. The Mobile Platform:
Enabling citizen sensors
Christopher H. Tomkins-Tinch
RIT BS, Imaging Science (2011)
tomkinstinch at ornl dot gov
cht(nine-three-three-nine) at rit dot edu
Hi, Iʼm Chris Tomkins-Tinch from the Rochester Institute of Technology. This summer, I worked on creating an iPhone application interface to Sensorpedia and an associated backend web
service.
3. http://weblogs.newsday.com/news/local/longisland/politics/blog/anemometer.jpg
http://biosingularity.files.wordpress.com/2006/03/cantilever.jpg
http://www.kenrockwell.com/nikon/d40/images/d40-sensor.jpg
We might think of a weather station--an anemometer, or maybe a camera sensor. These are largely stationary sensors designed for specific tasks.
5. We too are sensors.
We too, then, are sensors. We can respond in ways static sensors cannot. We can notice things that autonomous systems may miss. We can frame readings with relevant context, and
quickly capture what matters.
11. Mobile phones
are ubiquitous.
Almost everyone in this country has a cell phone. In the decade following 1996, cell phone adoption jumped from thirty-four million to two hundred three million. Thatʼs a significant increase.
http://www.post-gazette.com/pg/06075/671034-294.stm
13. http://www.flickr.com/photos/davidwatts1978/3199405401/
This photo of US Flight 1549 was taken by Janis Krums on January 15, 2009 while standing on the bank of the Hudson River. It was captured with an iPhone, and distributed by Twitter
before any news agency of record had reached the scene.
14. http://www.fulana.org/images/if-you-see-something.JPG
http://www.flickr.com/photos/usgeologicalsurvey/2593475733/
http://www.narragansett.k12.ri.us/nes/images/citizen.gif
Whether acting in response to natural disasters, in the service of national security, or out of curiosity, citizens are a powerful source of data. The question then is, “How do we leverage this
power in an organized way?”
15. I wonʼt go into detail on what Sensorpedia is; that is outside the scope of this presentation. In short, it is a way to aggregate and index sensor data.
16. Here we have a screenshot of the BETA Sensorpedia interface. The popup window in the middle shows a stationary traffic camera in South Carolina.
17. Sensorpedia is currently indexing many different types of in-situ sensors. Here we can see numerous weather and buoy sensors.
18. Imagine if Sensorpedia included near-real-time human Sensor readings. Here is a screenshot of Flickrvision. As photographs are uploaded to Flickr, this service shows a thumbnail on
Google Maps. What if we could leverage the same functionality with Sensorpedia?
19. How do we get human
readings into ?
The question then, is “How do we get readings from distributed users into Sensorpedia?”
20. Create a mobile
application.
We can do it with a mobile phone application--something to interface with the native device hardware and serve data to Sensorpedia in an intuitive way. This summer I targeted the iPhone
for this task.
21. Initially, I began with a sketch-based interface design. This is something Iʼve learned to embrace from working with David Resseguie. Itʼs a low-risk and quick way to think about how
prototypes should work. Sketching on a tablet PC is especially helpful because it makes the iterative design process simple.
22. The main idea is to give each iPhone owner a feed on Sensorpedia. They can then capture the deviceʼs senses--accelerometer, camera, and position. They can also provide an optional
comment. A more generic data field is also included, anticipating sensors that interface through the iPhoneʼs dock connector.
23. Users experience native iPhone user interface controls, and will be able to review post-capture entries.
24. The app provides a historical backlog. From the Sensorpedia web interface, the API, or the Sensorpedia mobile app itself, users will be able to review past entries.
25. Rather than talk directly to Sensorpedia, the mobile application will send data to an associated web service which will then in turn generate feeds to be later consumed by Sensorpedia.
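To make that handoff concrete, here is a hedged sketch of the kind of JSON payload the app might POST to the web service. The field names mirror the readings data model shown later in the deck, but the exact wire format is an assumption.

```python
import json

# Hypothetical reading payload; field names follow the readings table
# described later in the presentation, everything else is illustrative.
reading = {
    "api_key": "EXAMPLE-KEY",            # placeholder credential
    "timestamp": "2009-08-05T14:30:00Z",
    "lat": 35.93,
    "lon": -84.31,
    "altitude": 275.0,
    "accel_x": 0.02, "accel_y": -0.01, "accel_z": -0.98,
    "comment": "Traffic backed up near the bridge",
}

# Serialize to the JSON body the app would send in an HTTP POST.
payload = json.dumps(reading)
print(payload)
```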
26. Left as an exercise
for the reader.
http://www.flickr.com/photos/austinevan/1225274637/
This was an educational program after all, and I needed to learn. Dave Markʼs iPhone book and Stephen Kochanʼs Objective-C books were invaluable for learning how to develop for Appleʼs
mobile platform.
27. After getting comfortable with Objective-C and Appleʼs way of doing mobile development, I had an early Sensorpedia application running on the iPhone simulator.
28. It presents an interface that makes capturing the iPhoneʼs senses simple and quick.
30. In addition to being able to capture using the large button on the first tab, users can opt to have the app capture immediately upon load. This makes it easy to catch time-sensitive events.
31. Each user of the application will need a username and password. Having this credential obviates the need to tie the app to specific devices, and permits fleet deployment. Data from any
device will make it to the same feed, as long as it comes from the same user.
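As an illustration of how per-user credentials can decouple data from devices, here is a minimal server-side sketch of password hashing and verification. The deck does not show the real service's scheme; PBKDF2 and these function names are assumptions.

```python
import hashlib
import hmac
import os

# Illustrative sketch (not the actual service code): hash a password at
# registration and verify it at login, so any device presenting the same
# credentials maps to the same user feed.
def hash_password(password, salt=None):
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("s3cret")
print(verify_password("s3cret", salt, stored))  # True
print(verify_password("wrong", salt, stored))   # False
```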
32. The app handles errors gracefully, using windows according to Appleʼs user interface guidelines.
33. In accordance with Appleʼs user interface guidelines, the app also ties in to the iPhoneʼs global settings pane.
34. The Sensorpedia iPhone app settings are available from a global location familiar to the user.
37. Making the mobile
application does not
complete the system.
Creating the mobile application does not get information from field users into Sensorpedia, however. A web service is needed to complete the system.
38. You may remember this diagram from earlier in the presentation. Weʼve covered the iPhone side of things, and the Sensorpedia side remains largely complete. The “Third party web
service” in the middle of this diagram is needed to collect data from iPhones and provide it to Sensorpedia as a compatible ATOM feed.
40. (it’s all code)
Thereʼs not much to see for the web interface. Itʼs about a thousand lines of Python code, executing through Apache with mod_wsgi. It makes use of existing libraries for JSON, database
interaction, and WSGI. It uses web.py.
41. It has a RESTful interface.
'http://baseurl/auth'
'http://baseurl/submit'
'http://baseurl/query/(userId)/[returnCount]/[offset]'
'http://baseurl/delete'
'http://baseurl/image/(imageId)'
'http://baseurl/imagethumb/(imageId)/[maxSize]'
Like Sensorpedia itself, the web service has a RESTful interface, and responds to normal HTTP operations like GET and POST. It collects data in JSON format, stores it locally in a SQLite
database, and serves it in varying ways, depending on content type.
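The service itself uses web.py; as a rough, stdlib-only sketch of the same RESTful routing idea (URL patterns like /query/(userId)/[returnCount]/[offset] dispatched to handlers), consider:

```python
import json
import re
from wsgiref.util import setup_testing_defaults

# Illustrative route table only; the real service's web.py URL mapping
# may differ. Optional path segments become optional regex groups.
ROUTES = [
    (r"^/query/(?P<user_id>\d+)(?:/(?P<count>\d+))?(?:/(?P<offset>\d+))?$", "query"),
    (r"^/submit$", "submit"),   # would accept POSTed JSON readings
    (r"^/auth$", "auth"),       # would check credentials
]

def application(environ, start_response):
    # Match the request path against each pattern and report the handler
    # name plus extracted parameters (a real handler would do the work).
    path = environ.get("PATH_INFO", "/")
    for pattern, name in ROUTES:
        match = re.match(pattern, path)
        if match:
            body = json.dumps({"endpoint": name, "params": match.groupdict()})
            start_response("200 OK", [("Content-Type", "application/json")])
            return [body.encode()]
    start_response("404 Not Found", [("Content-Type", "application/json")])
    return [b'{"error": "not found"}']

# Exercise the WSGI app without a server:
environ = {}
setup_testing_defaults(environ)
environ["PATH_INFO"] = "/query/42/10/0"
out = application(environ, lambda status, headers: None)
print(out[0].decode())
```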
42. The data model consists of two tables:

Readings: id, user_id, image, image_thumb, accel_x, accel_y, accel_z, magnetometer, altitude, comment, lon, lat, generic_content, timestamp
Users: id, api_key, password_hash, username, date_registered, date_last_seen, sp_uuid
The data model is quite simple, and provides tables for users and readings. Each of the iPhoneʼs sensors is given a column, along with a generic_content column for extensibility.
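Assuming plain SQLite column types, the two tables could be created and exercised like this (illustrative DDL, not the service's actual schema):

```python
import sqlite3

# Sketch of the two-table data model from the slide, with assumed types.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (
    id              INTEGER PRIMARY KEY,
    api_key         TEXT,
    password_hash   TEXT,
    username        TEXT UNIQUE,
    date_registered TEXT,
    date_last_seen  TEXT,
    sp_uuid         TEXT
);
CREATE TABLE readings (
    id              INTEGER PRIMARY KEY,
    user_id         INTEGER REFERENCES users(id),
    image           BLOB,
    image_thumb     BLOB,
    accel_x REAL, accel_y REAL, accel_z REAL,
    magnetometer    REAL,
    altitude        REAL,
    comment         TEXT,
    lon REAL, lat REAL,
    generic_content TEXT,  -- extensibility column for future sensors
    timestamp       TEXT
);
""")
conn.execute("INSERT INTO users (username, api_key) VALUES (?, ?)", ("chris", "KEY"))
conn.execute(
    "INSERT INTO readings (user_id, lat, lon, comment) VALUES (?, ?, ?, ?)",
    (1, 35.93, -84.31, "test reading"),
)
row = conn.execute(
    "SELECT username, comment FROM readings JOIN users ON users.id = readings.user_id"
).fetchone()
print(row)  # ('chris', 'test reading')
```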
43. One of the resources I used in creating the web backend for the Sensorpedia iPhone app is the Sensorpedia Python Library, or spylib as weʼve been calling it. Tim Garvin created it during
his time here at ORNL. Spylib takes the raw data and transforms them into a Sensorpedia-compatible ATOM feed.
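As a rough sketch of the kind of transformation spylib performs, here is a JSON reading wrapped as an Atom entry; the element layout is illustrative, not spylib's actual output format.

```python
import json
from xml.etree import ElementTree as ET

ATOM_NS = "http://www.w3.org/2005/Atom"

def reading_to_atom_entry(reading):
    # Build a namespaced Atom <entry> carrying the raw reading as JSON
    # content. Real Atom entries also require an <id> element; omitted
    # here for brevity.
    entry = ET.Element(f"{{{ATOM_NS}}}entry")
    ET.SubElement(entry, f"{{{ATOM_NS}}}title").text = reading.get("comment", "reading")
    ET.SubElement(entry, f"{{{ATOM_NS}}}updated").text = reading["timestamp"]
    content = ET.SubElement(entry, f"{{{ATOM_NS}}}content", type="application/json")
    content.text = json.dumps(reading)
    return entry

reading = {"timestamp": "2009-08-05T14:30:00Z", "lat": 35.93, "lon": -84.31,
           "comment": "sample"}
xml = ET.tostring(reading_to_atom_entry(reading), encoding="unicode")
print(xml)
```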
45. http://www.flickr.com/photos/howieluvzus/389163804/
The next thing to do is to get the app fully working on actual hardware. Eighty percent of the application was developed using the iPhone simulator. This works well. The simulator provides
constant canned values for the GPS and magnetometer. A photo album can be used to simulate taking a picture, and a hack allows one to simulate the accelerometer on a MacBook Pro.
Simulating the phone accelerometer with the MacBook Pro's Sudden Motion Sensors can be accomplished using http://bit.ly/18uGq8 to send data via UDP to a waiting accelerometer
override http://bit.ly/m6E7E . It is especially neat because it's transparent to the developer. Deploying to ARM swaps in the native accelerometer thanks to #if !TARGET_CPU_ARM ; no
changes to the interface. Once the application is complete and running on the device, it can be submitted to Apple for entry to the App Store.
46. The mobile platform
Enabling citizen sensors
Christopher H. Tomkins-Tinch David R. Resseguie
Rochester Institute of Technology Oak Ridge National Laboratory
Imaging Science (BS 2011) Computational Sciences and Engineering
tomkinsc@gmail.com resseguiedr@ornl.gov
We too are sensors.

Prior work
Sharing information from sensors and other real-time data systems is critical for situational awareness, knowledge discovery, and simulation. Current proprietary systems and ad-hoc interoperability solutions increase sensing system integration time and delay decision-making and data analysis.

The Application - Interface design
The Sensorpedia mobile application was designed to capture data from the iPhone's sensors and relevant context as quickly as possible. This timely situational awareness is critical. User interaction for the Sensorpedia mobile application was prototyped with sketch-based interfaces. This technique allowed for rapid iterative design, and greatly accelerated the development process.
Sensorpedia is a program initiated by Oak Ridge National
Laboratory (ORNL) to utilize "Web 2.0" social networking principles
and lightweight data portability standards to organize and provide
rapid access to online sensor network data and related data sets.
Sensorpedia has leveraged established industry best practices for
information dissemination, and provides an accessible application
programming interface (API) for retrieving data in standardized
forms such as JavaScript Object Notation (JSON), and the Atom
Syndication Format. Data feeds can be tagged with keywords and
indexed in the spatial and temporal domains. Sensorpedia
enables decision makers to pull in many different types of
information sources and view them on a common operating
picture.
Intention - citizen sensors
The goal of the Sensorpedia Mobile Platform was to develop a
software application which can be run on a mobile computing
device and used to transmit distributed data readings from citizens
in near-real time. This application permits citizens to rapidly report
event information, such as disaster occurrences, security
anomalies, and accounts of emergencies.
This photo of US Flight 1549 was taken by
Janis Krums on January 15, 2009 while
standing on a ferry crossing the Hudson
River. It was captured with an iPhone, and
distributed by Twitter before any news
agency of record had reached the scene.
http://www.flickr.com/photos/davidwatts1978/3199405401/
Implementation
The Sensorpedia mobile application is implemented in Objective-C, and utilizes native Apple Foundation and Cocoa classes. It adheres to the model-view-controller paradigm for software development. The associated web service is implemented in Python and utilizes existing libraries for its web server gateway interface (WSGI), image processing, JavaScript Object Notation (JSON), and database interaction. Data are stored in a relational SQLite database and can be retrieved through a Representational State Transfer (REST) interface.

Architecture - Web Service
The creation of the mobile application necessitated an associated web service, which collects and registers data with Sensorpedia.
Data flow
The web service communicates with devices using JavaScript Object Notation (JSON) and publishes data using the Atom Syndication Format (ATOM).

Data model
Users maintain accounts to allow each user to be associated with multiple devices. Data are associated with users through an API key. Each of the sensors present in the device and a generic content block are represented in the data model.

Users: id, api_key, password_hash, username, date_registered, date_last_seen, sp_uuid
Readings: id, user_id, image, image_thumb, accel_x, accel_y, accel_z, magnetometer, altitude, comment, lon, lat, generic_content, timestamp
Sensorpedia is funded by the Department of Homeland Security’s Southeast Region Research Initiative (SERRI) and is supported in
part by the U.S. Department of Energy under DOE Project No. 2367-T103-06. For more information about Sensorpedia and related
efforts at ORNL, please contact Bryan Gorman at gormanbl@ornl.gov or visit Sensorpedia online at http://www.sensorpedia.com
This is the poster presented on the application at the 2009 ORNL Summer Student Poster Session, on August 5, 2009.
47. The Mobile Platform:
Enabling citizen sensors
Christopher H. Tomkins-Tinch
RIT BS, Imaging Science (2011)
tomkinstinch at ornl dot gov
cht(nine-three-three-nine) at rit dot edu
Hi, Iʼm Chris Tomkins-Tinch from the Rochester Institute of Technology. This summer, Iʼve worked on creating an iPhone application and an associated backend server to interface with
Sensorpedia.