This document describes a system for supporting improvisational music ensembles using smartphone sensors. The system aims to allow beginners to play improvised melodies that are consonant with a background tune by tracking the vertical motion of a user's hand. It uses Bayesian networks trained on motion data to estimate pitch notation, attack timing, and the vertical motion of the hand based on smartphone sensor values. Evaluation experiments show the system can estimate pitch notation with over 70% accuracy but has low precision for attack timing estimation. Future work includes improving motion tracking, adding ensemble support for multiple users, and collecting supervised training data using a Kinect camera.
Supporting System of Improvisational Ensemble Based on User's Motion Using Smartphone Sensors
1. Support System of Improvisational Ensemble Based on User's Motion Using Smartphone Sensors
Souta MIZUNO¹, Shugo ICHINOSE¹, Shun SHIRAMATSU¹, Tetsuro KITAHARA²
¹ Nagoya Institute of Technology  ² Nihon University
2. Purpose
Developing an improvisational ensemble support system
– Even beginners can play an improvisational ensemble with a background tune
– Difficult element: tonality (chord progression)
→ Our system corrects the tonality of the melody by outputting consonant tones
– Easy elements: rhythm and the up-and-down of the melody (pitch contour)
→ These are input via physical gestures
– A previous study [Ichinose 17] developed an improvisational ensemble support system driven by the user's body motion detected with a motion-sensing camera
• Rhythm and pitch contour can be specified by body motion
– In this study, we aim to achieve the same with more widely used devices, i.e., smartphone sensors
[Ichinose 17] Ichinose et al., "Improvisation Ensemble Support Systems for Music Beginners Based on Body Motion Tracking," in Proceedings of the 2017 6th IIAI International Congress on Advanced Applied Informatics, pp. 794-798
(Figure: example pitches of 523 Hz, 494 Hz, 440 Hz, and 392 Hz.)
3. Issues
1. Estimating the vertical motion of the user's hand to determine pitch notation
 – We need an accurate position-tracking method that estimates the vertical motion of the user's hand from smartphone sensors
2. Deciding on an input method for attack (output) timing
 – It should be an intuitive motion
3. Outputting consonant tones that satisfy the tonality of the background tune
 – Achieved by restricting the output sounds (correct tonality)
(Figure: dropping the hand lowers the pitch; raising the hand raises the pitch.)
4. Outline
1. Method of position tracking and the configuration of our system
 – Proposed approaches to determine pitch
 – Description of the method
 – Evaluation experiment of position tracking
 – System configuration
2. Input method of attack timing by intuitive motion
3. Estimating pitch notation by machine learning
4. Evaluation experiment
5. Unsolved issues
5. Two approaches to determine pitch notation
We try the following two approaches to determine the pitch notation based on the vertical motion of the user's hand:
Approach 1: Cumulative sum of acceleration
 – The pitch notation is determined by estimating the vertical hand position from the values of the acceleration sensor, gyro sensor, etc.
Approach 2: Cumulative sum of acceleration + machine learning
 – The pitch notation is determined by machine learning
 – Smartphone sensor values and the cumulative sum of acceleration are used as input values
 – A Bayesian network is used as the method
6. Position tracking
Smartphone sensors used:
• Gyroscope
• Acceleration
• Gravity
• Magnetic field
The vertical motion of the user's hand is estimated from the smartphone sensor values: integrating the acceleration in the vertical direction over time yields the moving distance p, i.e., the vertical motion of the user's hand.
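The accumulation from vertical acceleration to moving distance p can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the function name is hypothetical, the input is assumed to be gravity-removed vertical acceleration, and simple rectangular integration is used; only the 5 ms sampling period comes from the paper.

```python
# Sketch: estimating the vertical hand displacement by integrating the
# vertical acceleration twice (acceleration -> velocity -> position).
# Assumes gravity has already been removed from the input samples.

DT = 0.005  # sampling period: one sample about every 5 ms (as in the paper)

def track_position(vertical_accel, dt=DT):
    """Return the estimated vertical position after each sample."""
    velocity = 0.0
    position = 0.0
    positions = []
    for a in vertical_accel:
        velocity += a * dt          # first integration: a -> v
        position += velocity * dt   # second integration: v -> p
        positions.append(position)
    return positions

# A constant 1 m/s^2 upward acceleration held for 1 s (200 samples)
# should move the hand roughly 0.5 m upward.
trace = track_position([1.0] * 200)
```

Because sensor noise and bias enter both integration steps, small acceleration errors accumulate quadratically in the position estimate, which is consistent with the gradually increasing error reported on the next slide.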
7. Evaluating the accuracy of position tracking
• We compared the position-tracking accuracy while users performed a predefined motion (up ⇒ down ⇒ down ⇒ up …)
 – The blue line is the tracking result from smartphone sensors
 – The orange line is the tracking result from Kinect
• The estimation error gradually increases
 – The error depends on the user
→ We therefore adopt Approach 2: machine learning
8. System configuration
The improvisational ensemble support system consists of:
• Position tracking: detects feature values of the user's motion from the smartphone sensor values
• Machine learning: estimates attack timing and pitch notation based on the user's motion and the background tune; a chord progression of the background tune is given as input
• User's physical gesture: inputs the rhythm and pitch contour
• Output: the system plays the background tune and outputs melodies based on the estimated attack timing and pitch notation
9. Outline
1. Method of position tracking
2. Input method of attack timing by intuitive motion
3. Estimating pitch notation by machine learning
4. Evaluation experiment
5. Unsolved issues
10. Input method of attack timing
Output by shaking the smartphone:
 – Lets the user play with a more intuitive motion
 – But detection from the sensor values is not easy
Output by tapping a button on the screen:
 – A simple method, so the timing can be output accurately
 – But the motion lacks an intuitive feel
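Why shake detection is difficult can be illustrated with a naive detector. This is a hypothetical sketch, not the system's actual detector: the threshold value, the rising-edge logic, and all signal values are invented for illustration.

```python
# Sketch: detecting "shake" attacks by thresholding the acceleration
# magnitude. A rising edge above the threshold counts as one attack.
# Threshold and data are illustrative, not from the paper.

def detect_shakes(accel_magnitudes, threshold=12.0):
    """Return the sample indices where an attack is detected."""
    attacks = []
    above = False
    for i, m in enumerate(accel_magnitudes):
        if m > threshold and not above:
            attacks.append(i)  # rising edge: a new shake begins
        above = m > threshold
    return attacks

# Two deliberate shakes (15.0, 16.0) plus one small bump (12.5) that
# still crosses the threshold and produces a spurious attack.
signal = [9.8, 9.8, 15.0, 9.8, 9.8, 12.5, 9.8, 16.0, 9.8]
attacks = detect_shakes(signal)
```

A detector like this fires on the small bump as well as on the deliberate shakes, mirroring the low precision reported later: small motions are recognized as shake motions.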
11. Outline
1. Method of position tracking
2. Input method of attack timing by intuitive motion
3. Estimating pitch notation by machine learning
 – Creating training data
 – Description of the Bayesian networks
 – Definition of the input value (moving distance)
4. Evaluation experiment
5. Unsolved issues
12. Creating training data
To create training data, we recorded smartphone sensor data while test subjects moved a smartphone along with a particular melody, i.e., "Edelweiss" (data sampled every 5 ms).
By learning from the training data, three Bayesian network models were created:
1. For estimating attack timing
2. For estimating pitch notation
3. For estimating the vertical motion of the user's hand
13. Bayesian network (user's vertical motion)
Output:
 – h: the vertical motion of the user's hand (raise motion / drop motion / no change)
 – ni: pitch notation (e.g., C, D, A)
 – t: attack timing (attack timing / no attack)
Input:
• From motion values:
 – ay: acceleration (y-axis)
 – v: velocity
 – vc: velocity change
 – p: moving distance
 – g: gravity
 – t: attack timing
 – rm: the most frequently estimated result
• From musical context:
 – c: chord chart (background tune)
 – ni-1: last pitch notation
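The paper does not publish the network structure or parameters, so the following is only a toy illustration of the kind of discrete inference such a model performs: a posterior over the three motion classes given one discretized feature. All probability values are invented.

```python
# Toy sketch of discrete Bayesian inference over the three motion classes.
# P_FEAT plays the role of a conditional probability table; the numbers
# are invented for illustration and are not the paper's parameters.

MOTIONS = ("raise", "drop", "no_change")

# P(observed displacement sign | motion class)
P_FEAT = {
    "raise":     {"up": 0.7, "down": 0.1, "flat": 0.2},
    "drop":      {"up": 0.1, "down": 0.7, "flat": 0.2},
    "no_change": {"up": 0.2, "down": 0.2, "flat": 0.6},
}

def posterior(feature, prior):
    """P(motion | feature) by Bayes' rule; prior may encode musical context."""
    joint = {m: prior[m] * P_FEAT[m][feature] for m in MOTIONS}
    total = sum(joint.values())
    return {m: joint[m] / total for m in MOTIONS}

uniform = {m: 1 / 3 for m in MOTIONS}
post = posterior("up", uniform)  # observing an upward displacement
```

In the actual system the prior would itself be conditioned on the musical-context inputs (the chord chart c and the last pitch notation ni-1) rather than being uniform.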
14. Definition of moving distance
We vary the definition of the input value p (moving distance):
1. Absolute distance: the moving distance from the start position
2. Relative distance: the moving distance from the last output position
 – Aim: with relative distance, the position-tracking error should be smaller than with absolute distance
(Figure: hand height over time. At the second attack timing, the absolute distance from the start position is 10 cm, while the relative distance from the first attack timing is 10 − 5 = 5 cm.)
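The two definitions of p can be stated directly in code. The numbers reproduce the slide's example (absolute distance 10 cm, relative distance 10 − 5 = 5 cm); the function names and the 0 cm start height are illustrative assumptions.

```python
# Sketch of the two definitions of the input value p (moving distance).

def absolute_distance(position, start_position):
    """Definition 1: moving distance from the start position."""
    return position - start_position

def relative_distance(position, last_output_position):
    """Definition 2: moving distance from the last output position."""
    return position - last_output_position

# Heights (cm) from the slide's example: the hand is at 5 cm at the
# first attack timing and at 10 cm at the second.
start = 0.0
first_attack_height = 5.0
second_attack_height = 10.0

abs_d = absolute_distance(second_attack_height, start)
rel_d = relative_distance(second_attack_height, first_attack_height)
```

Relative distance only has to stay accurate between consecutive outputs, so the integration error accumulated since the start position cancels out, which is the stated aim of definition 2.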
15. Outline
1. Method of position tracking
2. Input method of attack timing by intuitive motion
3. Estimating pitch notation by machine learning
4. Evaluation experiment
 – Experiment 1: melody generation using training data
 – Experiment 2: evaluation of the match rate between the user's motion and the change of pitch
5. Unsolved issues
16. Evaluation experiments
Experiment 1: generate a melody using the training data
 – Accuracy rate of the pitch-notation estimation
 – Accuracy rate of the vertical-motion estimation
 – Recall and precision of the attack-timing estimation
Experiment 2: generate a change of pitch matching the user's motion
 – Match rate between the user's motion and the estimated change of pitch
17. Evaluation of estimation accuracy
• Estimation accuracy of the pitch notation and the user's vertical motion
 – Using the training data as test data
 – Background tune: "Edelweiss"
• Two granularities are evaluated:
 – Training-data sample units: one sample about every 5 ms
 – MIDI note units: for each note, the estimation result of its first sample is used
(Figure: pitch over time for a three-note example, C–D–E; number of notes: 3, number of samples: 24.)
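Slide 17 evaluates note units by taking each note's first-sample result, while the network input rm refers to the most frequently estimated result over samples. That majority-style aggregation can be sketched as follows; the function name and the sample data are illustrative.

```python
# Sketch: collapsing per-sample pitch estimates (one about every 5 ms)
# into a single per-note estimate by majority vote, in the spirit of the
# network input r_m ("the most frequently estimated result").
from collections import Counter

def note_estimate(sample_estimates):
    """Return the most frequently estimated pitch within one note."""
    return Counter(sample_estimates).most_common(1)[0][0]

# 24 samples for one note: mostly "E", with a few misestimated samples.
samples = ["E"] * 18 + ["D"] * 4 + ["C"] * 2
majority = note_estimate(samples)
```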
18. Evaluating the estimation accuracy of attack timing
• The granularity is note units
• Recall and precision with respect to the background tune were examined:

recall = (number of attack timings estimated by our system that follow the background tune) / (number of attack timings of the background tune)

precision = (number of attack timings estimated by our system that follow the background tune) / (number of attack timings estimated by our system)

(Figure: the attack timings estimated by our system aligned against the attack timings of the background tune.)
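These two measures can be computed as follows. The slide does not define how an estimated attack timing is judged to "follow" the background tune, so the tolerance-window matching used here is an assumption, and all timing values are invented.

```python
# Sketch: recall and precision of attack-timing estimation against the
# background tune. Each reference attack matches at most one estimated
# attack within a tolerance window (the window size is an assumption).

def recall_precision(reference, estimated, tol=0.05):
    """reference, estimated: lists of attack times in seconds."""
    matched = 0
    used = set()
    for r in reference:
        for i, e in enumerate(estimated):
            if i not in used and abs(e - r) <= tol:
                matched += 1
                used.add(i)
                break
    recall = matched / len(reference) if reference else 0.0
    precision = matched / len(estimated) if estimated else 0.0
    return recall, precision

# 4 reference attacks; the system emits 6, of which 3 follow the tune.
rec, prec = recall_precision([0.0, 0.5, 1.0, 1.5],
                             [0.01, 0.52, 1.4, 1.51, 1.7, 1.9])
```

Spurious extra attacks lower the precision without affecting the recall, which matches the reported result (recall 0.63 vs. precision 0.26 for the shake motion).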
19. Results of Experiment 1
Estimation accuracy at note units (correct notes / all notes):
 – Shake motion: pitch notation 0.45 (121/270) with absolute distance (relative distance not conducted; future work); vertical motion 0.56 (152/270) with absolute distance (relative distance not conducted; future work)
 – Touch motion: pitch notation 0.56 (152/270) absolute, 0.71 (192/270) relative; vertical motion 0.68 (184/270) absolute, 0.91 (247/270) relative
Estimation accuracy at sample units (correct samples / all samples):
 – Shake motion: pitch notation 0.70 (45888/65278); vertical motion 0.60 (42975/65278)
 – Touch motion: pitch notation 0.82 (53764/65251); vertical motion 0.78 (50909/65251)
Estimation accuracy of attack timing:
 – Shake motion: recall 0.63 (171/270), precision 0.26 (171/661)
Findings:
 • For both shake and touch, the accuracy rate at sample units is higher than at note units
 • The values with relative distance are higher than those with absolute distance
 • The precision is very low because even small motions are recognized as shake motions
20. Evaluating the match rate between the user's motion and the estimated change of pitch
 – We compared the use of absolute distance and relative distance
 – "Using musical context" means the last pitch notation ni-1 is used as input; "not using musical context" means it is not
Shake motion (relative distance not conducted; future work):
 – Using musical context: 0.48 (31/64) with absolute distance
 – Not using musical context: 0.53 (34/64) with absolute distance
Touch motion:
 – Using musical context: 0.55 (35/64) absolute, 0.53 (34/64) relative
 – Not using musical context: 0.51 (33/64) absolute, 0.75 (48/64) relative
Findings:
 – With absolute distance, there is not much difference
 – With relative distance, the accuracy rate improves
Possible reasons for these results:
 • The estimated melody transition is affected by the melody in the training data
 • The training data contains only one tune: "Edelweiss"
21. Outline
1. Method of position tracking
2. Input method of attack timing by intuitive motion
3. Estimating pitch notation by machine learning
4. Evaluation experiment
5. Unsolved issues
22. Unsolved issues
• We could not yet support an ensemble of multiple users
– Currently, the system supports an improvisational ensemble of one user and the background tune
– The system must keep the consonance relation between each user's output sound correct
• There was no accurate training data for the vertical motion of the user's hand
– Currently, we cannot confirm the correctness of the estimated movement of the user's hand,
  because we assume the vertical motion of the user's hand is the same as the vertical motion
  of the background tune (without observation)
– We plan to use data recorded by a Kinect camera as supervised data
23. Recording supervised data with Kinect
We are currently building a training dataset using Kinect
- as reference data for the vertical motion of the user's hand
- this part is not yet included in our paper
[Figure: hand position recognized by Kinect]
24. Data recording experiment for machine learning
1. The subjects performed "shake" and "tap" motions in time with a quadruple rhythm
– the test subjects moved the smartphone vertically in a predefined pattern,
  "high→high→middle→low" or "high→middle→low→low"
– smartphone sensor data and tap timings were recorded
[Figure: shake and tap motions at the high, middle, and low hand positions]
(This part is not included in our paper)
25. Data collection experiment for machine learning
2. The hands of the test subjects were recognized by Kinect, and the height of the
hand was recorded as accurate training data
(This part is not included in our paper)
26. Characteristics of the recorded data
The smartphone's angle differs according to the height of the user's hand
[Figure: smartphone angle at the high, middle, and low hand positions]
(This part is not included in our paper)
27. Discussion: importance of the hand angle
We found that the height of the user's hand may be related to the smartphone's angle
- We need to incorporate the smartphone's angle for a more accurate estimation of the
  vertical motion
(This part is not included in our paper)
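Since gravity is already one of the sensor inputs, a tilt angle for the phone can be derived directly from the gravity-sensor reading. A minimal sketch under an assumed axis convention (y-axis along the phone's long side; the function name is our own):

```python
import math

def tilt_angle_deg(gx, gy, gz):
    """Angle between the phone's y-axis and the horizontal plane,
    computed from a gravity-sensor reading (m/s^2 per axis)."""
    g = math.sqrt(gx * gx + gy * gy + gz * gz)
    # clamp to guard against rounding just outside [-1, 1]
    return math.degrees(math.asin(max(-1.0, min(1.0, gy / g))))

print(tilt_angle_deg(0.0, 9.81, 0.0))  # ≈ 90.0 (phone held upright)
print(tilt_angle_deg(0.0, 0.0, 9.81))  # ≈ 0.0 (phone lying flat)
```

Feeding such an angle into the network as an extra input is one way to act on the observation above that hand height and phone angle are related.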
28. Conclusion and future work
Conclusion
• We developed a system that estimates the vertical motion of the user's hand using
  smartphone sensors
• We created Bayesian networks that estimate attack timing and pitch notation from
  smartphone sensor values
– We developed both a shake-motion version and a touch-motion version
– The estimation accuracy of attack timing is unstable in the shake-motion case
Future work
• We will improve the position tracking method using the accurate training data that is
  currently being collected
• We need to implement an estimation method suited to time-series data
– HMM, LSTM, etc.
• We aim to implement an ensemble function for multiple users
29. Bayesian network (attack timing)
Input values
  ax: acceleration (x-axis)
  ay: acceleration (y-axis)
  v: velocity
  vc: velocity change
  p: moving distance
  g: gravity
Output
  t: attack timing (0 or 1; attack timing or no attack)
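Several of these inputs (v, vc, p) are derived quantities rather than raw sensor readings. One plausible way to obtain them is numerical integration of the acceleration stream; a minimal sketch (the function name, the sampling period, and the interpretation of vc as the per-sample velocity delta are our own assumptions, not details from the paper):

```python
def motion_features(accel, dt=0.02):
    """Integrate a 1-D acceleration series (m/s^2, one sample every dt
    seconds) into per-sample velocity v, velocity change vc, and
    accumulated moving distance p."""
    v, p = 0.0, 0.0
    feats = []
    for a in accel:
        vc = a * dt          # velocity change over this sample
        v += vc              # running velocity
        p += abs(v) * dt     # moving distance accumulates regardless of sign
        feats.append((v, vc, p))
    return feats

feats = motion_features([1.0, 1.0, -1.0], dt=0.1)
print(feats[-1])  # final sample: v back near 0.1 m/s after deceleration
```

In practice dead reckoning like this drifts quickly, which is consistent with the paper's interest in more robust position tracking.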
30. Bayesian network (pitch notation)
Input values
• From motion values
  ay: acceleration (y-axis)
  v: velocity
  vc: velocity change
  p: moving distance
  g: gravity
  t: attack timing
  rm: the most frequently estimated result
• From musical context
  c: chord chart (background tune)
  ni-1: last pitch notation
Output
  ni: pitch notation (e.g. C, D)
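The role of the musical-context inputs can be sketched with Bayes' rule: the chord chart supplies a prior over pitch notations, which is multiplied by a motion-based likelihood and renormalized. The probability tables below are made-up illustrations, not values from the paper:

```python
# prior over the next pitch notation given the current chord (hypothetical)
prior_given_chord = {
    "C": {"C": 0.5, "E": 0.3, "G": 0.2},
    "G": {"G": 0.5, "B": 0.3, "D": 0.2},
}
# likelihood of the observed motion features under each pitch (hypothetical)
motion_likelihood = {"C": 0.2, "E": 0.6, "G": 0.2}

def posterior(chord):
    """Combine the chord-conditioned prior with the motion likelihood."""
    prior = prior_given_chord[chord]
    scores = {n: prior[n] * motion_likelihood.get(n, 0.0) for n in prior}
    z = sum(scores.values())
    return {n: s / z for n, s in scores.items()}

post = posterior("C")
print(max(post, key=post.get))  # E: the motion evidence outweighs the prior
```

Restricting candidate pitches to chord tones of the background tune is what keeps a beginner's improvised melody consonant even when the motion estimate is noisy.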
31. Bayesian network (user's vertical motion)
Input values
  ay: acceleration (y-axis)
  v: velocity
  vc: velocity change
  p: moving distance
  g: gravity
Output
  h: the vertical motion of the user's hand (raise motion, drop motion, or no change)
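As a point of comparison for this three-way output, even a simple dead-zone rule on the vertical velocity gives a baseline classifier. This is a stand-in baseline of our own, not the paper's Bayesian network, and the threshold value is an assumption:

```python
def vertical_motion(v, dead_zone=0.05):
    """Baseline three-way classification of vertical hand motion
    from vertical velocity v (m/s): raise, drop, or no change."""
    if v > dead_zone:
        return "raise"
    if v < -dead_zone:
        return "drop"
    return "no change"

print([vertical_motion(v) for v in (0.3, -0.2, 0.01)])
# ['raise', 'drop', 'no change']
```

A learned model earns its keep over such a rule when velocity is noisy, which is exactly the case for velocity integrated from smartphone accelerometers.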
32. Multi-user ensemble using inter-device communication
We plan to implement an ensemble system for multiple users
• The system will support ensembles through inter-device communication
  - sending and receiving musical performance information (chord progression)
• We plan to implement the communication method with Bluetooth
  - the system must take into account the delay caused by sending data
[Figure: two ensemble support systems (user 1 and user 2), each taking smartphone sensor
values as input, exchange musical performance information (chord progression)]
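One common way to account for the transmission delay mentioned above is to timestamp each performance message at the sender so the receiver can measure how late it arrived. A minimal sketch of the message handling only (the message fields, function names, and the assumption of roughly synchronized clocks are ours; a real implementation would send these bytes over a Bluetooth link):

```python
import json, time

def make_message(chord, clock=time.time):
    """Sender side: package a chord-progression event with a send timestamp."""
    return json.dumps({"chord": chord, "sent_at": clock()})

def receive(raw, clock=time.time):
    """Receiver side: decode the event and measure the transmission delay,
    assuming the two devices' clocks are roughly synchronized."""
    msg = json.loads(raw)
    delay = clock() - msg["sent_at"]
    return msg["chord"], delay

# simulated exchange with fixed clocks (no real Bluetooth link here)
raw = make_message("G7", clock=lambda: 100.00)
chord, delay = receive(raw, clock=lambda: 100.12)
print(chord, round(delay, 2))  # G7 0.12
```

Given the measured delay, the receiver can schedule the chord change slightly in the future so both devices switch chords on the same beat.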
33. Recording supervised data with Kinect
• We conducted an experiment recording supervised data with Kinect
  - we recorded the height of the user's hand, as recognized by Kinect, as supervised
    data while users performed a predefined motion
• As a result of the experiment
  - we observed an unexpected difference in the smartphone's angle according to the
    height of the user's hand
  - the smartphone's angle is important for a more accurate estimation of the vertical motion
[Figure: predefined motion high→high→middle→low at the high, middle, and low hand positions]