
Evaluating Heterogeneous Information Access (Position Paper)



Information access is becoming increasingly heterogeneous. We need to better understand the more complex user behaviour in this context in order to properly evaluate search systems that deal with heterogeneous information. In this paper, we review the main challenges of evaluating search in this context and propose some avenues for incorporating user aspects into evaluation measures.

Full paper at
To be presented at the SIGIR 2013 MUBE workshop.

PhD work/reflection/conclusions of Ke (Adam) Zhou.



  1. Evaluating Heterogeneous Information Access (Position Paper)
     Ke Zhou¹, Tetsuya Sakai², Mounia Lalmas³, Zhicheng Dou² and Joemon M. Jose¹
     ¹University of Glasgow, ²Microsoft Research Asia, ³Yahoo! Labs Barcelona
     SIGIR 2013 MUBE workshop
  2. IR Evaluation
     • System-oriented evaluation (test collection + metrics)
     • User-oriented evaluation (interactive user study)
     • Current endeavours to incorporate the user into system-oriented metrics:
       – Time-Biased Gain (Smucker & Clarke)
       – U-measure (Sakai & Dou)
       – etc.
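The metrics named on this slide discount gain by the time a user spends reaching each result rather than by rank alone. A minimal sketch of Time-Biased Gain in that spirit follows; the per-summary and per-document reading times and the simplifying assumption that every relevant document is clicked are illustrative choices, not the calibrated model from Smucker & Clarke's paper (only the 224-second half-life is a value they report):

```python
import math

HALF_LIFE = 224.0  # seconds; gain halves every HALF_LIFE seconds of effort

def time_to_reach(k, rels, t_summary=4.4, t_doc=30.0):
    """Estimated time (seconds) to reach rank k (1-based): reading k-1
    snippets plus the documents clicked earlier. For simplicity we assume
    every relevant document above rank k is clicked and read."""
    t = 0.0
    for i in range(k - 1):
        t += t_summary
        if rels[i] > 0:
            t += t_doc
    return t

def decay(t, half_life=HALF_LIFE):
    """Exponential decay: the probability the user is still searching at time t."""
    return math.exp(-t * math.log(2) / half_life)

def tbg(rels):
    """Time-Biased Gain for a ranked list of relevance grades (0 = non-relevant)."""
    return sum(g * decay(time_to_reach(k, rels))
               for k, g in enumerate(rels, start=1) if g > 0)

print(round(tbg([1, 0, 1, 0, 0]), 3))  # → 1.887
```

Note how the second relevant document contributes less than the first: reading the first document costs time, so fewer (modelled) users survive to reach rank 3.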
  3. Increasingly Heterogeneous Nature of Search
  4. Increasingly Heterogeneous Nature of Search ……
  5. Position
     • Compared with traditional homogeneous search, evaluation in the context of heterogeneous information is more challenging and requires taking into account more complex user behaviours and interactions.
  6. Challenges
     • Non-linear traversal browsing
     • Diverse search tasks
     • Coherence
     • Diversity
     • Personalization
     • etc.
  7. Various Presentation Strategies
     Blended, non-linear blended, tabbed, ……
  8. User Browsing Pattern
     [Figure: "F" browsing pattern on an organic search page vs. "E" browsing pattern on an aggregated search page]
     Source: http://-tracking-on-universal-and-personalized-search-12233
  9. Non-linear Traversal Browsing
     Source: http://-tracking-on-universal-and-personalized-search-12233
  10. Search Tasks: Vertical Orientation
      CIKM'10 (Sushmita et al.), WWW'13 (Zhou et al.)
  11. Search Tasks: Complexity
      SIGIR'12 (Arguello et al.)
  12. Coherence
      [Figure: a result block whose items all match the query sense ("Car") vs. one with a single off-sense item ("Animal")]
      CIKM'12 (Arguello et al.)
  13. Coherence
      [Figure: a coherent "Car" result block vs. one with many off-sense "Animal" items mixed in]
      CIKM'12 (Arguello et al.)
  14. Diversity
      [Figure: an image-only result page vs. one blending News + Map + Image results]
      SIGIR'12 (Zhou et al.)
  15. Personalization
      [Figure: average user vs. SIGIRer]
  16. Avenues of Research
      • Better understanding of users
        – Click models: WSDM'12 (Chen et al.), SIGIR'13 (Wang et al.)
        – Vertical orientation: CIKM'10 (Sushmita et al.), WWW'13 (Zhou et al.)
        – Task complexity: SIGIR'12 (Arguello et al.)
        – Task coherence: CIKM'12 (Arguello et al.)
        – Diversity: SIGIR'12 (Zhou et al.)
        – Personalization
        – Non-linear presentation strategies
  17. Avenues of Research
      • Better incorporation of learned user behaviour into evaluation metrics
        – Follow SIGIR'13 (Chuklin et al.) and convert the obtained aggregated-search click models into system-oriented evaluation metrics.
        – Model additional features within a powerful evaluation framework (e.g. TBG, U-measure, AS-metric).
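The conversion in the first bullet can be illustrated with a minimal sketch: under a click model, a natural system-oriented metric is the expected utility a user collects, i.e. the sum over ranks of P(click) × relevance. The plain cascade model below is a standard textbook simplification chosen for brevity, not the vertical-aware aggregated-search models Chuklin et al. actually use:

```python
def cascade_click_probs(attractiveness):
    """Cascade click model: the user scans top-down, clicks result k with
    probability a_k if it is reached, and stops after the first click.
    P(click at k) = a_k * prod_{i<k} (1 - a_i)."""
    probs, p_reach = [], 1.0
    for a in attractiveness:
        probs.append(p_reach * a)
        p_reach *= (1.0 - a)
    return probs

def expected_utility(rels, attractiveness):
    """Click-model-based metric: expected relevance gained over one query."""
    return sum(p * r for p, r in zip(cascade_click_probs(attractiveness), rels))

# A ranking that puts the attractive, relevant result first scores higher,
# because later results are reached with lower probability.
print(round(expected_utility([1, 0], [0.8, 0.3]), 2))  # → 0.8
print(round(expected_utility([0, 1], [0.3, 0.8]), 2))  # → 0.56
```

The same recipe extends to richer models: swap in examination probabilities learned from aggregated-search logs (e.g. with a vertical-attention bias) and the metric inherits the observed user behaviour.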
  18. Thank you! Questions?
      Ke Zhou,
      TREC 2013 FedWeb track: https://