SMART RESOURCE-AWARE MULTI-SENSOR NETWORK
INTERREG IV RESEARCH PROJECT
Autonomous complex event detection in scenarios with limited infrastructure
Klagenfurt, September 2, 2011
MASSIMILIANO VALOTTO, PAOLO OMERO, SABRINA LONDERO
valoo@infofactory.it - email@example.com - firstname.lastname@example.org
http://www.infofactory.it
MAIN GOAL: SMART MULTI-SENSOR NETWORK
Designing a smart resource-aware MULTI-SENSOR NETWORK capable of autonomously DETECTING and LOCALIZING various EVENTS such as screams, animal noises, tracks of PERSONS, and more COMPLEX HUMAN BEHAVIOURS.
RESEARCH AREAS

1. NETWORK RECONFIGURATION
Due to limited resources, the sensor network should be able to reconfigure itself in order to limit consumption (for example, switching off cameras when nothing happens in that area).

2. AUDIO/VIDEO ANALYSIS
Video frames and audio signals are analyzed in order to recognize objects and sounds. We can identify, for example, the type, speed, direction and coordinates of a moving object. It is possible to recognize different classes of objects such as humans, cars, dogs and cows.

3. COMPLEX EVENT DETECTION
Semantic analysis is performed over data extracted during audio and video analysis, in order to detect complex events such as:
<people shooting at deer>
<person walking in a restricted area>
<dog fighting with person>
For this purpose we use an ontological model and a rules engine.

4. MULTIMEDIA DB, RETRIEVAL & ANALYSIS
The multimedia DB is devoted to archiving the video and audio files received from the sensors. Furthermore, the system includes an advanced access, retrieval and knowledge-discovery layer.
NETWORK: solar powered, auto-reconfigurable
ACQUISITION: video, audio, pictures
ANALYSIS: sound detection, object recognition, localization
COLLECTING: semantic analysis, complex event detection, multimedia & events archive, data mining
1. NETWORK RECONFIGURATION
Operate the network at the highest possible performance while minimizing resource usage.
- Change the power mode of nodes and components.
- Dynamically adapt the network structure and node configuration according to current application requirements.
- Find the optimal resource allocation in the network.
- Move cameras in order to follow the scene of action, and switch on a camera when something is expected to happen in a specific area.
- LOW ACTIVITY → exchange only status information; power down as many sensors as possible.
- HIGH ACTIVITY → exchange control and data messages; activate as many sensors as needed.
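The LOW/HIGH activity policy above can be sketched as a tiny state machine. This is a minimal illustration under assumed names (`SensorNode`, `report`, the 0.5 threshold), not the project's actual controller:

```python
from dataclasses import dataclass, field

@dataclass
class SensorNode:
    name: str
    mode: str = "LOW"                 # current power mode: "LOW" or "HIGH" activity
    active_sensors: set = field(default_factory=set)

    def report(self, activity_level: float, threshold: float = 0.5) -> str:
        """Reconfigure the node according to measured activity in its area."""
        if activity_level > threshold and self.mode == "LOW":
            self.mode = "HIGH"
            self.active_sensors.update({"camera", "microphone"})
            return "exchange control and data messages"
        if activity_level <= threshold and self.mode == "HIGH":
            self.mode = "LOW"
            self.active_sensors.clear()  # power down as many sensors as possible
            return "exchange only status information"
        return "no change"

node = SensorNode("node-1")
print(node.report(0.8))  # -> exchange control and data messages
print(node.report(0.1))  # -> exchange only status information
```

A real deployment would drive `activity_level` from the analysis layer (slide 2) rather than from a fixed threshold.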
2. AUDIO & VIDEO ANALYSIS
3D localization, recognition and classification of audio sources.
- Localization of sound sources via time difference of arrival (TDOA): sound waves hit the microphones at different time instances, and the TDOA is related to the line of origin of the sound wave.
- Classification of audio sources: identify specific sound patterns based on characteristic features. Examples: barking dogs, shouting humans.
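The geometric idea behind TDOA can be shown with two microphones: the difference in path length from the source to each microphone, divided by the speed of sound, gives the time difference of arrival. This is a synthetic illustration of the principle, not the project's localization algorithm:

```python
import math

C = 343.0  # speed of sound in air, m/s

def tdoa(source, mic_a, mic_b):
    """Time difference of arrival (seconds) between two microphones."""
    da = math.dist(source, mic_a)  # path length to microphone A
    db = math.dist(source, mic_b)  # path length to microphone B
    return (db - da) / C

# A source directly in front of a 1 m microphone pair arrives at both
# microphones simultaneously (TDOA ~ 0); an off-axis source does not.
print(tdoa((0.0, 10.0), (-0.5, 0.0), (0.5, 0.0)))  # -> 0.0
print(tdoa((5.0, 10.0), (-0.5, 0.0), (0.5, 0.0)))  # negative: closer to mic B
```

Inverting this relation (all source positions yielding a given TDOA form a hyperbola) is what lets the array recover the line of origin mentioned on the slide.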
2. AUDIO & VIDEO ANALYSIS
Analysis and PTZ-camera re-configuration.
- Detect simple patterns of activity on a ground reference map.
- SOLUTION: project the real world onto a camera-based reference system, and cover the activity patterns with conic sections representing the observed zone of each video sensor.
- The new configuration optimally covers the area with respect to the activities occurring in it.
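A camera's observed zone on the ground map can be modelled as a 2-D viewing cone. The toy check below (hypothetical parameters, not the project's optimization) tests whether an activity point falls inside such a cone:

```python
import math

def in_fov(cam_pos, pan_deg, fov_deg, max_range, point):
    """True if `point` lies inside the camera's viewing cone on the map."""
    dx, dy = point[0] - cam_pos[0], point[1] - cam_pos[1]
    if math.hypot(dx, dy) > max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    diff = (bearing - pan_deg + 180) % 360 - 180  # signed angular difference
    return abs(diff) <= fov_deg / 2

# Camera at the origin, panned east (0 deg), 60 deg field of view, 50 m range:
print(in_fov((0, 0), 0, 60, 50, (30, 10)))  # -> True  (bearing ~18 deg)
print(in_fov((0, 0), 0, 60, 50, (0, 40)))   # -> False (bearing 90 deg)
```

The re-configuration step would then search over pan angles so that the union of such cones covers the detected activity patterns.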
3. COMPLEX EVENT DETECTION
Detect simple and complex events by means of a consistent ontology.
- Define simple and complex events by means of a consistent ontology.
- Describe the events' context, i.e., spatial, temporal, object and event relationships.
- Apply reasoning mechanisms to identify complex events from low-level features.
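The spirit of the rules engine can be sketched as a single hand-written rule that fuses simple detections into a complex event when they are close in time and space. The event names, schema and 60-second window are illustrative assumptions, not the project's ontology:

```python
from dataclasses import dataclass

@dataclass
class SimpleEvent:
    kind: str         # e.g. "deer_detected", "shot_detected", "person_detected"
    area: str         # zone on the ground reference map
    timestamp: float  # seconds

def detect_hunting(events, window=60.0):
    """Rule: deer + shot + person in the same area within `window` seconds."""
    complex_events = []
    for deer in (e for e in events if e.kind == "deer_detected"):
        nearby = [e for e in events
                  if e.area == deer.area
                  and 0 <= e.timestamp - deer.timestamp <= window]
        if {"shot_detected", "person_detected"} <= {e.kind for e in nearby}:
            complex_events.append(("hunter_shot_deer", deer.area))
    return complex_events

log = [SimpleEvent("deer_detected", "AREA 1", 0.0),
       SimpleEvent("shot_detected", "AREA 1", 12.0),
       SimpleEvent("person_detected", "AREA 1", 15.0)]
print(detect_hunting(log))  # -> [('hunter_shot_deer', 'AREA 1')]
```

An ontology-backed reasoner generalizes this: the spatial/temporal relationships become explicit properties, and rules like the one above are declared rather than hard-coded.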
4. MULTIMEDIA DATABASE, RETRIEVAL & ANALYSIS
Collect multimedia data from each sensor, save events, and perform advanced analysis.
- Store multimedia data, low-level features, and simple and complex events in a multimedia database.
- Find patterns in the data: recurring events (e.g., visitors tend to stop in a specific area) and relations between events (the event "a deer is detected in the morning in AREA 1" is often followed by "the deer is detected in AREA 2 in the afternoon") → path discovery.
- Provide a user interface for operators: a high-level view of "what is going on", and the ability to formulate complex queries (e.g., all events in a certain area, the areas most frequented by bears, the least active sensors, …).
- Alert an operator using mobile devices, and provide a mobile interface to access the event description and the audio/video data.
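The "path discovery" idea above can be illustrated by counting how often a detection in one area is followed, later the same day, by a detection of the same subject in another area. The record schema and data are synthetic:

```python
from collections import Counter

# (day, hour, subject, area) detection records -- synthetic example data
detections = [
    (1, 9,  "deer", "AREA 1"), (1, 15, "deer", "AREA 2"),
    (2, 8,  "deer", "AREA 1"), (2, 16, "deer", "AREA 2"),
    (3, 10, "deer", "AREA 1"), (3, 14, "deer", "AREA 3"),
]

transitions = Counter()
for day, h1, subj, a1 in detections:
    for day2, h2, subj2, a2 in detections:
        # same subject, same day, strictly later, different area
        if day2 == day and subj2 == subj and h2 > h1 and a2 != a1:
            transitions[(a1, a2)] += 1

# The most frequent transition suggests a recurring path:
print(transitions.most_common(1))  # -> [(('AREA 1', 'AREA 2'), 2)]
```

Real path discovery over the archive would use a sequential-pattern-mining query against the multimedia database rather than a quadratic scan.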
AN EXAMPLE OF THE EVENT DETECTION PROCESS
1. A camera recognizes a deer.
2. A shot is detected by a microphone array in the same area.
3. The position of the hunter is computed.
4. The network is reconfigured to look at the hunter's position.
5. The person (the hunter) is detected by a camera.
6. The system alerts an operator and sends the event description "a hunter shot a deer" together with the audio/video data.
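The six steps above form an ordered pipeline in which each stage consumes the previous results. A minimal sketch with hypothetical stage functions (the stubbed return values stand in for the real detectors):

```python
def run_pipeline(steps, context):
    """Run each named stage in order, accumulating results in `context`."""
    for name, step in steps:
        context[name] = step(context)
    return context

steps = [
    ("deer",      lambda ctx: {"area": "AREA 1"}),            # 1. camera recognizes a deer
    ("shot",      lambda ctx: ctx["deer"]["area"]),           # 2. microphone array, same area
    ("hunter_at", lambda ctx: (120.0, 45.0)),                 # 3. position estimate (map coords)
    ("reconfig",  lambda ctx: f"point cameras at {ctx['hunter_at']}"),  # 4. network reconfigured
    ("person",    lambda ctx: True),                          # 5. hunter confirmed by camera
    ("alert",     lambda ctx: "a hunter shot a deer"),        # 6. event description sent with A/V data
]
result = run_pipeline(steps, {})
print(result["alert"])  # -> a hunter shot a deer
```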
POWER SEARCH.
The user interface allows users to perform powerful retrieval operations over the collected data, as well as advanced statistical analysis to extract knowledge from the archive.
The basic access metaphor used for querying the archive is a what/where/when three-dimensional space.
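The what/where/when metaphor amounts to filtering the archive along three independent dimensions, any of which may be left unspecified. A sketch with a hypothetical event schema:

```python
# Synthetic archive entries; the real system stores these in the multimedia DB.
events = [
    {"what": "shot", "where": "AREA 1", "when": "morning"},
    {"what": "deer", "where": "AREA 2", "when": "afternoon"},
]

def query(archive, what=None, where=None, when=None):
    """Return events matching every dimension that was specified."""
    return [e for e in archive
            if (what is None or e["what"] == what)
            and (where is None or e["where"] == where)
            and (when is None or e["when"] == when)]

print(query(events, where="AREA 1"))
# -> [{'what': 'shot', 'where': 'AREA 1', 'when': 'morning'}]
```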
EVENTS.
The search results are visualized and can be navigated following an event/place/network three-dimensional approach.
The events view shows the list of events resulting from the search. For each event we can see the date, the involved subjects, the action and, if defined, the zone where it happened. We can also see a map showing the exact position of the event and any related multimedia content (videos, images or audio).
DATA MINING.
The application also offers the user advanced statistical analyses, useful for extracting knowledge from the archive.
Examples include the distribution of events of different types over time or in specific periods, and trends in sensor activity.