The First SIGIR Workshop on Neural Information Retrieval
Neu-IR 2016
(Pronounced “new IR”)
https://www.microsoft.com/en-us/research/neuir2016
https://twitter.com/neuir2016
W. Bruce Croft
University of Massachusetts
Amherst, US
Jiafeng Guo
Chinese Academy of Sciences
Beijing, China
Maarten de Rijke
University of Amsterdam
Amsterdam, The Netherlands
Bhaskar Mitra
Bing, Microsoft
Cambridge, UK
Nick Craswell
Bing, Microsoft
Bellevue, US
A big welcome from all
The Organizers
I’m certain that deep learning will come to
dominate SIGIR over the next couple of years …
just like speech, vision, and NLP before it. This is
a good thing. Deep learning provides some
powerful new techniques that are just being
amazingly successful on many hard applied
problems. However, we should realize that there
is also currently a huge amount of hype about
deep learning and artificial intelligence. We
should not let a genuine enthusiasm for
important and successful new techniques lead
to irrational exuberance or a diminished
appreciation of other approaches…
We love the excitement!
Neu-IR 2016 Word Cloud
SIGIR 2016 Word Cloud
Neu-IR 2016 in numbers
# of registrations: 121 (as of July 19, Tuesday)
# of submissions: 27 (excluding 3 incomplete submissions)
# of accepted papers: 19 (excluding 1 withdrawn by the author)
% of accepted papers: 73% (excluding 1 withdrawn by the author)
# of accepted papers on word embeddings: 8
# of accepted papers on deep NNs: 10
# of accepted papers on document ranking: 7 (other tasks: QA, proactive, conversational…)
# of accepted papers from academia: 17 (based on first author’s affiliation)
# of countries based on first author: 9 (spread across 3 continents)
# of reviewers: 14 (including 9 PC members)
Program Committee
Call for Papers
Discuss,
Share,
Learn.
Non-archival submissions
to encourage more
presentations
Lessons from the Trenches
to share learnings
Poster sessions to facilitate
1:1 discussions
https://twitter.com/neuir2016
The rest of this day…
Morning I: Keynote (Tomas Mikolov), paper presentations, coffee break
Morning II: Lessons from the Trenches, poster presentations, lunch
Afternoon I: Keynote (Hang Li), paper presentations, coffee break
Afternoon II: Breakout session, breakout session retrospective, closing comments


Editor's Notes

  • #3 A very big welcome to everyone to the first ever SIGIR workshop on Neural Information Retrieval.
  • #4 This year we kicked off SIGIR with a very optimistic note from Christopher Manning about the field of Neural IR, which was also justifiably accompanied by a note of caution about some of the disproportionate exuberance in the field.
  • #5 As organizers, when we submitted our workshop proposal we were optimistic that this is an area that a big part of the SIGIR community is excited about. But we were still pleasantly overwhelmed by the actual response we received for a first-time workshop.
  • #6 It is super exciting to see the broad range of themes…
  • #7 …in the accepted papers.
  • #8 Interestingly if we look for the same topics in the SIGIR main conference papers…
  • #9 …we find that they are still relatively under-represented. This in fact validates one of our intuitions that led us to propose this workshop - that in spite of all the excitement, the field of neural IR is actually progressing slowly and might benefit from bringing a focused community together.
  • #10 So given how much we love our metrics in the SIGIR community, I would love to quickly share some numbers about today’s workshop.
  • #11 We have more than 120 registrations as of last Tuesday.
  • #12 We had close to 30 submissions in response to our call for papers…
  • #13 …of which 19 will be presented today in the form of posters. In addition, the organizers have also selected 5 papers, keeping both the interestingness and the diversity of the papers in mind, for oral presentations.
  • #14 …Our acceptance rate for submissions was 73%.
  • #15 Among the accepted papers we noticed some broad themes. 8 papers focused on improving or leveraging word embeddings.
  • #16 More than half of the papers studied deeper models for IR tasks.
  • #17 A large number of papers evaluated on the document ranking task, but there were also papers on QnA, proactive IR, knowledge-based IR, and conversational search.
  • #18 Based on the first author’s affiliation, most papers came from academia, although a few of them had co-authors from industry.
  • #19 We also noticed a good geographic spread, which included 4 papers each from France and India.
  • #20 A large number of submissions means a large number of papers to review. We had at least 2 reviews for every submitted paper and every reviewer on average reviewed 4-5 papers.
  • #21 I would like to say a special thanks to all the PC members for their invaluable effort in finishing all the reviews on time in spite of the larger than expected number of submissions, and keeping us on track for all our deadlines. Thank you!
  • #22 This is also a good time to announce the call for papers for a special issue of the Information Retrieval Journal on Neural Information Retrieval. I hope that many of you in this room will consider submitting to this journal.
  • #23 Through the course of today I request everyone to ask questions, express opinions and initiate discussions. Tell us live via Twitter what’s going well and what we should do differently. We will try our best to adapt as the day progresses based on your live feedback.
  • #24 OK, we have a packed day and I know every one of you is eagerly waiting to hear from Tomas Mikolov. So I will shut up now. :) Tomas, as you all already know, is a research scientist at Facebook AI Research. He has previously been part of the Google Brain team and a long time back also interned at Microsoft Research. But probably what many of us in the IR community know Tomas for is his work on word2vec and other vector representations of text, which many of us have used or extended in our own work. Tomas has also promised me that he will hang around after his talk – so hopefully many of you will get an opportunity to have more detailed discussions with him during the course of this day. So without further ado, I invite Tomas to come up on stage and tell us more about recurrent networks and beyond.