Big data and new technologies for doing survey research: these were, in my view, the two themes of the 2014 AAPOR conference. The conference organisation tried to push the theme ‘Measurement and the role of public opinion in a democracy’, but I don’t think that theme was really reflected in the talks. Or perhaps I missed those talks; the conference was huge as always (more than 1,000 participants).
One of the greatest challenges in survey research is declining response rates. Around the globe, it appears to be getting harder and harder to convince people to participate in surveys. Researchers are unsure why response rates are declining. The usual suspects are a general worsening of the ‘survey climate’, increased time pressure on people, and the rise of direct marketing.
This year’s Nonresponse Workshop was held in London last week.
The AAPOR conference last week gave a good overview of what survey methodologists worry about. There were relatively few people from Europe this year, and I found that the issues methodologists worry about sometimes differ between Europe and the USA. At the upcoming ESRA conference, for example, there are more than ten sessions on mixing survey modes. At AAPOR, mixing modes was definitely not ‘hot’.
With eight parallel sessions running at most times, I have only seen bits and pieces of everything that went on.
This weekend is the deadline for submitting a presentation proposal to this year’s conference of the European Survey Research Association. That’s one of the two major conferences for people who love to talk about things like nonresponse bias, total survey error, and mixing survey modes.
As in previous years, it looks like the most heated debates will be about mixed-mode surveys. As survey methodologists, we have been struggling to combine multiple survey modes (Internet, telephone, face-to-face, mail) in a good way.
All of my research focuses on methods for collecting and analysing panel survey data. One of the primary problems of panel survey projects is attrition, or drop-out: over the course of a panel survey, many respondents decide to stop participating.
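To get a feel for why this matters, here is a minimal sketch (my own illustration, not from any actual study; the sample size and retention rates are made-up numbers) of how even modest wave-on-wave drop-out compounds over a panel’s lifetime:

```python
# Minimal sketch of compounding panel attrition.
# The sample size and retention rates are hypothetical
# illustration values, not figures from any real panel.

initial_sample = 5000
retention_per_wave = [0.85, 0.90, 0.92, 0.94]  # assumed retention at waves 2-5

n = initial_sample
print(f"Wave 1: {n} respondents")
for wave, rate in enumerate(retention_per_wave, start=2):
    n = round(n * rate)  # respondents still participating at this wave
    print(f"Wave {wave}: {n} respondents ({n / initial_sample:.0%} of wave 1)")
```

Even with retention improving each wave in this toy example, barely two-thirds of the original sample is left by wave 5; and if the drop-outs differ systematically from the stayers, survey estimates will be biased.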
Last July I visited the Panel Survey Methods Workshop in Melbourne, where we had extensive discussions about panel attrition: how to study it, what its consequences (bias) are for survey estimates, and how to prevent it from happening altogether.
In late August 2011 I attended the Internet Survey Methodology Workshop, which brought together people from academia, official statistics, and market research agencies. One of the issues discussed there has had me thinking ever since: panel conditioning. Some people seem really worried that respondents in panel surveys start behaving or thinking differently because of repeated participation in a survey.
Panel conditioning is closely linked with the issue of ‘professional’ respondents.
Gerry Nicolaas (of NatCen) has just written a good review of the nonresponse workshop we both attended this year. See http://natcenblog.blogspot.com/2011/10/challenges-to-current-practice-of.html#comment-form The Nonresponse Workshops are a great place to meet and discuss with survey researchers in a small setting. The next workshop will be held in early September 2012 at Statistics Canada. See www.nonresponse.org