
Melcomplexity Escom 20090729

Talk held at ESCOM 2009 in Jyväskylä



  1. Melodic Complexity
     Klaus Frieler, Universität Hamburg, Musikwissenschaftliches Institut
     ESCOM 2009, Jyväskylä, 12.8.2009
  2. Melodic Complexity
     • Perceived complexity is a complex process generated within a signal/receiver system
     • Hypothesis: Perceived complexity is a function of (objective) signal complexity
  3. Melodic Complexity
     • Idea: Test various algorithmic melodic complexity measures in psychological experiments
     • If there are significant correlations, build a model
  4. Melodic Complexity
     • Complexity algorithms for n-gram sequences:
     • Entropies
     • Zipf complexity
     • N-gram redundancy
  5. Algorithm Construction
     • Given: Melody as onset/pitch sequences with metrical annotations
     • Apply basic transformations
     • Here: pitches, intervals, durations, metrical circle map (cf. later)
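A minimal sketch (not the author's implementation) of the first three transformations named on this slide, assuming a melody given as parallel lists of onset times in beats and MIDI pitches; the metrical circle map is treated later in the talk and is not included here.

```python
# Illustrative sketch: derive pitch, interval, and duration (inter-onset
# interval) sequences from an onset/pitch representation of a melody.
# All names are assumptions; the metrical circle map transform is omitted.

def basic_transformations(onsets, pitches):
    """onsets: note onset times in beats; pitches: MIDI pitch numbers."""
    intervals = [b - a for a, b in zip(pitches, pitches[1:])]  # semitone steps
    durations = [b - a for a, b in zip(onsets, onsets[1:])]    # inter-onset intervals
    return {"pitch": list(pitches), "interval": intervals, "duration": durations}

# Example: four ascending quarter notes (C4, D4, E4, F4)
print(basic_transformations([0.0, 1.0, 2.0, 3.0], [60, 62, 64, 65]))
```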
  6. Algorithm Construction
     • Main transformation: n-gram sequences, i.e. sequences of subsequences of length n
     • Calculate histograms of n-grams
     • Here: n = 1, 2, 3, 4, variable
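A short sketch, under the same assumptions, of how such n-gram histograms can be computed from any of the transformed sequences:

```python
from collections import Counter

def ngram_histogram(sequence, n):
    """Histogram of all contiguous n-grams (subsequences of length n)."""
    ngrams = [tuple(sequence[i:i + n]) for i in range(len(sequence) - n + 1)]
    return Counter(ngrams)

# Example: bigram (n = 2) histogram of an interval sequence
print(ngram_histogram([2, 2, 1, 2, 2], 2))
# Counter({(2, 2): 2, (2, 1): 1, (1, 2): 1})
```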
  7. Entropies
     • Entropies of the n-gram distribution
     • Normalized by maximum entropy
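A sketch of this measure; the slide does not say which maximum entropy is used, so normalizing by log2 of the number of distinct n-grams is an assumption here.

```python
import math
from collections import Counter

def normalized_ngram_entropy(ngrams):
    """Shannon entropy of the n-gram distribution, divided by a maximum
    entropy term so the result lies in [0, 1]. Dividing by log2 of the
    number of distinct n-grams is an assumption; the slide only says
    'normalized by maximum entropy'."""
    counts = Counter(ngrams)
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    h_max = math.log2(len(counts))
    return h / h_max if h_max > 0 else 0.0
```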
  8. Zipf complexity
     • Zipf's law: The ordered sequence of term frequencies obeys a power law (k = rank):
     • h(k) ~ k^(-s)
     • log h(k) ~ -s log k
  9. Zipf complexity
     [Figure illustrating Zipf's law. Source: Wikipedia]
  10. Zipf complexity
      • Ordered n-gram frequencies
      • Regression on log-log data with slope s
      • Define c := 2^s as Zipf complexity
      • s = 0 → c = 1; s = -1 → c = 0.5; s = -∞ → c = 0
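A minimal sketch of this definition, assuming an ordinary least-squares fit of log frequency against log rank over the n-gram counts; the exact regression details are not given on the slide.

```python
import math

def zipf_complexity(ngram_counts):
    """Zipf complexity c = 2**s, where s is the slope of a least-squares fit
    of log(frequency) against log(rank) for rank-ordered n-gram counts."""
    freqs = sorted(ngram_counts.values(), reverse=True)
    if len(freqs) < 2:
        return 1.0  # degenerate case: no slope can be fitted
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 2 ** slope  # s = 0 -> 1, s = -1 -> 0.5, s -> -inf -> 0
```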
  11. N-gram redundancy
      • The number of distinct elements in a sequence is a simple measure of redundancy
      • The more distinct elements, the more "complex" the sequence
  12. N-gram redundancy
      • Let |n(s)| be the count of distinct n-grams in a sequence s of length N
  13. N-gram redundancy
      • Extensions: Weighted sum of n-gram redundancies up to a fixed or variable n_max
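The formula on slide 12 is an image and is not reproduced in the transcript; the sketch below assumes the common normalization of distinct n-grams by the total number of n-grams (N - n + 1), together with the weighted extension of slide 13. The weights are illustrative placeholders.

```python
def ngram_redundancy(sequence, n):
    """Distinct n-grams divided by the total number of n-grams (N - n + 1).
    The exact normalization on slide 12 is not shown; this is an assumption."""
    ngrams = [tuple(sequence[i:i + n]) for i in range(len(sequence) - n + 1)]
    return len(set(ngrams)) / len(ngrams) if ngrams else 0.0

def weighted_ngram_redundancy(sequence, n_max, weights=None):
    """Weighted sum of n-gram redundancies for n = 1 .. n_max (slide 13);
    uniform weights are used here as a placeholder."""
    weights = weights or [1.0 / n_max] * n_max
    return sum(w * ngram_redundancy(sequence, n)
               for w, n in zip(weights, range(1, n_max + 1)))
```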
  14. Metrical Circle Map
  15. Example: "Mandy" by Barry Manilow
  16. Experiments
      • Two listening experiments with a total of 47 subjects
      • Stimuli: 12 folk songs and 3 jazz saxophone choruses; 9 melodies identical in both experiments
      • Task: Judgement of melodic complexity on a scale from 1 to 7
  17. Results
      • Normally distributed, reliable judgements
      • → Pooling of data from both experiments and using subject means for further comparisons
  18. Results
      • 42 complexity measures:
        - Note count
        - Metrical Markov entropies (0th, 1st order)
        - Zipf complexities (interval, pitch, duration)
        - N-gram redundancies (interval, pitch, duration)
        - Entropies (interval, pitch, duration)
  19. Correlations
      Note count: r = .869**
  20. Results
      • Note count explains the judgements nearly perfectly!
      • Calculate partial correlations for the other measures → only the metrical entropies remain
  21. Correlations
      0th order metrical entropy: r = .934**, r' = .837**
  22. Correlations
      1st order metrical entropy: r = .944**, r' = .867**
  23. Correlations
      Pitch entropy: r = .132, r' = .092
  24. Linear Regression
      • Stepwise regression of the variables with the highest correlations
      • Corrected R² = .929
      • zsubjmean = .345 * znotecount + .677 * zmeter1ent
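Read as a worked example: with both predictors standardized (z-scores over the stimulus set, an assumption here), the model on this slide predicts the standardized complexity judgement as follows.

```python
def predicted_complexity(z_note_count, z_meter1_entropy):
    """Regression model from slide 24 (standardized variables):
    zsubjmean = .345 * znotecount + .677 * zmeter1ent."""
    return 0.345 * z_note_count + 0.677 * z_meter1_entropy

# Example: a melody one standard deviation above the mean on both predictors
print(predicted_complexity(1.0, 1.0))  # 1.022
```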
  25. Conclusion
      • Good agreement with measured complexity
      • 1st order metrical Markov entropy shows the highest correlation
      • But: Note count explains most of the correlation
      • → Rather simple complexity?!
  26. Conclusion
      • No partial correlation with any pitch/interval-based measure could be found
      • Meter is the most important dimension…
  27. Outlook
      • We plan experiments with note counts kept constant
      • Pretests show that metrical entropies might be suited to predict the "hit potential" of pop melodies
  28. Thank you!
  29. Metrical Intervals
  30. Metrical Markov chain, 0th order
  31. Metrical Markov chains, 1st order
