Tutorials

Modeling User Behavior for Measuring Effectiveness (Full Day)

This tutorial combines an introduction to classical IR evaluation methods with material on more recent user-oriented approaches. We start with a review of standard approaches to evaluation, examining traditional evaluation methods from a user perspective. We then discuss sources of user behavior data that can be used to develop and calibrate user models for IR evaluation, including lab studies and implicit feedback. Finally, we introduce techniques for developing evaluation methodologies that directly model key aspects of the user's interaction with the system. The broad goal of the tutorial is to equip researchers with an understanding of modern approaches to IR evaluation, facilitating new research on this topic and improving evaluation methodology for emerging areas. This tutorial is an extended version of a half-day tutorial given at SIGIR 2015.

Instructors

Charles Clarke is a Professor in the School of Computer Science at the University of Waterloo, Canada. His research interests include information retrieval, web search, and text data mining. He has published on a wide range of topics, including papers related to question answering, XML, filesystem search, HCI, and statistical NLP, as well as the evaluation of information retrieval systems. He was a Program Co-Chair for SIGIR 2007 and 2014. He is currently Chair of the SIGIR Executive Committee, and Co-Editor-in-Chief of the "Information Retrieval Journal". He is a co-author of the graduate textbook "Information Retrieval: Implementing and Evaluating Search Engines", MIT Press, 2010.

Mark Smucker is an associate professor in the Department of Management Sciences at the University of Waterloo. Mark's recent work has focused on making information retrieval evaluation more predictive of actual human search performance. Mark has been a co-organizer of two TREC tracks, the SIGIR 2013 workshop on modeling user behavior for information retrieval evaluation (MUBE), and the SIGIR 2010 workshop on the simulation of interaction. He is a recipient of the SIGIR best paper award (2012) for his work with Charles Clarke on the time-based calibration of effectiveness measures. He is also a recipient of the University of Waterloo Faculty of Engineering's Teaching Excellence Award.

Emine Yilmaz is an assistant professor in the Department of Computer Science at University College London and a research consultant for Microsoft Research Cambridge. She is the recipient of the Google Faculty Award in 2014/15. Her main interests are evaluating the quality of retrieval systems, modelling user behaviour, learning to rank, and inferring user needs while using search engines. She has published extensively at major information retrieval venues such as SIGIR, CIKM, and WSDM. She has previously given several tutorials on evaluation, at SIGIR 2012 and SIGIR 2010 and at the RuSSIR/EDBT Summer School in 2011. She has also organized several workshops on Crowdsourcing (WSDM 2011, SIGIR 2011, and SIGIR 2010) and on User Modelling for Retrieval Evaluation (SIGIR 2013). She has served as one of the organizers of the ICTIR Conference in 2009, as the demo chair for the ECIR Conference in 2013, and as the PC chair for the SPIRE 2015 conference. She is also a co-coordinator of the Tasks Track in TREC 2015.


Applying Qualitative Methods in Studies of Retrieval Interactions (Half Day: Afternoon)

Researchers in interactive information retrieval (IR) most often design experiments to evaluate the effects of system features when used by information seekers. If data pertaining to user perceptions are gathered during a study, they are most often collected via a short questionnaire or, possibly, a post-task debriefing interview. Researchers in human-computer interaction (HCI) may be more familiar with a wider range of qualitative research methods, since many anthropologists and ethnographers have engaged with that field. In particular, HCI research teams may have investigated the use of new devices or software in the field over extended periods.

A variety of qualitative methods for data collection and analysis could prove useful to those interested in how people interact with retrieval systems (or find alternative means to address their information needs). For example, direct observation of a naturally occurring collaborative information seeking session would allow the observer to understand the diverse set of tools used to seek/find information, as well as the nature of the communication between the collaborators.

This tutorial will use hands-on exercises, as well as discussion, to introduce both interactive IR researchers and HCI researchers to a range of qualitative research methods that will be useful in their future studies.

Instructor

The tutorial will be presented by Barbara Wildemuth, a Professor in the School of Information and Library Science at the University of North Carolina. She is the author of the text “Applications of Social Research Methods to Questions in Information and Library Science” (Libraries Unlimited, 2009), which includes a number of chapters on qualitative methods and their application to information-related research questions. She regularly teaches a doctoral seminar on grounded theory methods and recently presented a workshop on that topic at the Libraries in the Digital Age conference in Croatia (2014). Her research includes many studies of user interactions with systems and other information sources, many of which incorporate qualitative methods.