A very big welcome to everyone to the first ever SIGIR workshop on Neural Information Retrieval.
This year we kicked off SIGIR with a very optimistic note from Christopher Manning about the field of neural IR, which was also justifiably accompanied by a note of caution about some of the disproportionate exuberance in the field.
As organizers, when we submitted our workshop proposal we were optimistic that this was an area that a large part of the SIGIR community is excited about. But we were still pleasantly overwhelmed by the actual response we received for a first-time workshop.
It is super exciting to see the broad range of themes…
…in the accepted papers.
Interestingly, if we look for the same topics in the SIGIR main conference papers…
…we find that they are still relatively under-represented. This in fact validates one of the intuitions that led us to propose this workshop: that in spite of all the excitement, the field of neural IR is actually progressing slowly and might benefit from bringing a focused community together.
So given how much we love our metrics in the SIGIR community, I would love to quickly share some numbers about today’s workshop.
We have more than 120 registrations as of last Tuesday.
We had close to 30 submissions in response to our call for papers…
…of which 19 will be presented today in the form of posters. In addition, the organizers have selected 5 of these papers for oral presentations, keeping both the interestingness and the diversity of the papers in mind.
…Our acceptance rate for submissions was 73%.
Among the accepted papers we noticed some broad themes. 8 papers focused on improving or leveraging word embeddings.
More than half of the papers studied deeper models for IR tasks.
A large number of papers evaluated on the document ranking task, but there were also papers on question answering, proactive IR, knowledge-based IR, and conversational search.
Based on first authors’ affiliations, most papers came from academia, although a few of them had co-authors from industry.
We also noticed a good geographic spread, which included 4 papers each from France and India.
A large number of submissions means a large number of papers to review. We had at least 2 reviews for every submitted paper and every reviewer on average reviewed 4-5 papers.
I would like to say a special thanks to all the PC members for their invaluable effort in finishing all the reviews on time, in spite of the larger-than-expected number of submissions, and for keeping us on track for all our deadlines. Thank you!
This is also a good time to announce the call for papers for a special issue of the Information Retrieval Journal on Neural Information Retrieval. I hope that many of you in this room will consider submitting to it.
Throughout the day, I request everyone to ask questions, express opinions, and initiate discussions. Tell us live via Twitter what’s going well and what we should do differently. We will try our best to adapt as the day progresses based on your live feedback.
OK, we have a packed day and I know every one of you is eagerly waiting to hear from Tomas Mikolov. So I will shut up now. :)
Tomas, as you all already know, is a research scientist at Facebook AI Research. He was previously part of the Google Brain team and, a long time ago, also interned at Microsoft Research. But what many of us in the IR community probably know Tomas best for is his work on word2vec and other vector representations of text, which many of us have used or extended in our own work. Tomas has also promised me that he will hang around after his talk, so hopefully many of you will get an opportunity to have more detailed discussions with him during the course of the day. So without further ado, I invite Tomas to come up on stage and tell us more about recurrent networks and beyond.
Neu-IR 2016: Opening note
The First SIGIR Workshop on Neural Information Retrieval
(Pronounced “new IR”)
W. Bruce Croft, University of Massachusetts
Chinese Academy of Sciences
Maarten de Rijke, University of Amsterdam, Amsterdam, The Netherlands
A big welcome from all
“I’m certain that deep learning will come to dominate SIGIR over the next couple of years … just like speech, vision, and NLP before it. This is a good thing. Deep learning provides some powerful new techniques that are just being amazingly successful on many hard applied problems. However, we should realize that there is also currently a huge amount of hype about deep learning and artificial intelligence. We should not let a genuine enthusiasm for important and successful new techniques lead to irrational exuberance or a diminished appreciation of other approaches…”
Christopher Manning
The rest of this day… Morning:
Lessons from the Trenches, to share learnings
Poster sessions, to facilitate discussions