International Choice Modelling Conference 2017

A Query Approach to Modeling Attendance to Attributes in Choice Experiments
Nathan Paul Kemper, Jennie Popp, Rodolfo M Nayga, Jr

Last modified: 28 March 2017


Choice experiments (CEs) are among the most widely used methods for consumer valuation of both private and public goods. In a CE, participants are asked to consider a product defined by several attributes, alongside a no-choice alternative (Hensher et al. 2015). Conventionally, all attributes and attribute levels are treated as relevant to the estimation of individual-level utility (Hess and Hensher 2010); however, recent research has focused on how people process the attributes presented to them in choice experiments. Respondents may attend to some attributes and ignore others during each choice task (Hess and Hensher 2010; Scarpa et al. 2013), and therefore may not make trade-offs across all attributes as assumed. Overlooking respondents’ attendance to attributes in choice models can affect coefficient estimates, model fit, performance measures, and welfare estimates (Hensher and Rose 2009; Scarpa et al. 2013), so accounting for patterns of attendance to attributes is essential for reliable estimation. Several previous studies have examined the processing strategies used by respondents in choice experiments (Balcombe et al. 2015; Bello and Abdulai 2016; Hess and Hensher 2010; Scarpa et al. 2013). While much research has been devoted to methods for identifying attribute attendance (AA) and non-attendance (ANA), it remains unclear how best to account for individual attribute processing strategies in CEs.

Our study contributes to the literature by comparing three approaches to accounting for patterns of AA. The first two are common approaches: 1) the self-reported or “stated” approach and 2) the inferred approach using methods proposed by Hess and Hensher (2010). In the third approach, we use Query Theory (Johnson et al. 2007) to examine the thought processes of individuals in our CE and then apply these data as a new approach to accounting for patterns of AA. We suggest that respondents run through a series of mental queries when confronted with choice tasks and that the content of these queries, as well as the order in which they are processed, influences choice behavior. We use a verbal reporting method called aspect-listing to gather information on respondents’ thought processes and to explore the effectiveness of Query Theory in helping us understand the information processing strategies of individuals in a CE. After each choice task, respondents were asked, “Please tell us what you were thinking of as you made this decision,” and we recorded the content and order of the responses to approximate the thought processes of respondents in each treatment. The aspect-listing task is designed to capture the effect of the unobservable queries by documenting what they produce. More sophisticated measures exist, but aspect-listing is easy to implement, particularly in large-sample market settings like the one used in this study.
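To make the aspect-listing step concrete, the coding of a free-text listing into attendance indicators and query order might look like the following sketch. The attribute labels and keyword lists are purely illustrative assumptions, not the authors’ actual coding scheme:

```python
# Hypothetical coding of one aspect-listing response into (a) the set of
# attributes the respondent attended to and (b) the order in which the
# attributes were first mentioned. Keyword lists are illustrative only.
ATTRIBUTE_KEYWORDS = {
    "gm": ["gmo", "genetically", "modified"],
    "origin": ["local", "imported", "location"],
    "carbon": ["carbon", "footprint", "emissions"],
    "price": ["price", "cost", "expensive"],
}

def code_listing(text):
    """Return (attended attributes, first-mention order) for one listing."""
    tokens = text.lower().split()
    order = []
    for token in tokens:
        for attr, keywords in ATTRIBUTE_KEYWORDS.items():
            if attr not in order and any(k in token for k in keywords):
                order.append(attr)  # record first mention only
    return set(order), order

attended, order = code_listing(
    "I looked at the price first, then whether it had GMO ingredients"
)
# attended -> {"price", "gm"}; order -> ["price", "gm"]
```

In practice such coding is usually done (or at least validated) by human coders, but a tally of this form is what links the verbal reports to attendance indicators in the choice model.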

Our study employs a between-sample design in which respondents are randomly assigned to one of two groups: 1) the query approach group or 2) the stated approach group. The experiment consists of two tasks. First, respondents participate in a CE in which they choose between poultry products differentiated by labels indicating the presence of genetically modified ingredients, production location, and carbon footprint. Respondents in the query approach group also complete the aspect-listing task after each choice task, while the stated approach group is asked after each choice task whether they “attended to” or “ignored” each attribute. Second, respondents answer a series of survey questions covering basic demographics, policy issues, and food preferences. We targeted primary grocery shoppers in US households. The data were collected through a national, web-based choice experiment survey built with Sawtooth Software and fielded by Survey Sampling International using its nationally representative consumer panel.

The design of the CE follows Scarpa et al. (2007), in which the assignment of attributes and attribute levels to product alternatives is determined using a sequential Bayesian design. Respondents' preferences and willingness to pay (WTP) are analyzed in a discrete choice framework consistent with random utility theory and Lancaster's consumer theory. A Mixed (Random Parameters) Logit (MXL) model with correlated errors and variance-enhancing error components is used to estimate preferences and WTP. We estimate various MXL models using data from the query approach treatment and the stated approach treatment, and then compare model structures, model fit, patterns of heterogeneity, and WTP measures to assess the effectiveness of the query and stated approaches in accounting for AA. Preliminary results indicate that the query approach may help us better understand the attribute processing strategies of individuals in a CE.
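A generic MXL specification of this kind can be sketched as follows; the notation, and the placement of the error component on the non-status-quo alternatives, are our assumptions rather than the authors’ reported model:

```latex
% Random utility for respondent n, alternative j, choice task t, with
% correlated random coefficients and an error component inflating the
% variance of the purchase alternatives (d_j = 1 if j is not the
% no-choice alternative).
U_{njt} = \beta_n^{\top} x_{njt} + \sigma \,\eta_n d_j + \varepsilon_{njt},
\qquad \beta_n \sim N(b, \Omega), \quad \eta_n \sim N(0, 1)

% Unconditional choice probability: the logit kernel (from i.i.d. extreme
% value \varepsilon) integrated over the random terms.
P_{nit} = \int\!\!\int
  \frac{\exp\!\left(\beta^{\top} x_{nit} + \sigma \eta\, d_i\right)}
       {\sum_{j} \exp\!\left(\beta^{\top} x_{njt} + \sigma \eta\, d_j\right)}
  \, f(\beta \mid b, \Omega)\, \phi(\eta)\, d\beta\, d\eta
```

WTP for an attribute then follows from the ratio of its coefficient to the negative of the price coefficient, which is how attendance patterns feed through to the welfare measures compared across treatments.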



Balcombe, K.G., Fraser, I.M. and McSorley, E. (2015). Visual attention and attribute attendance in multi-attribute choice experiments. Journal of Applied Econometrics 30:447–467.

Bello, M. and Abdulai, A. (2016). Impact of ex-ante hypothetical bias mitigation methods on attribute non-attendance in choice experiments. American Journal of Agricultural Economics, aav098.

Hensher, D.A. and Rose, J.M. (2009). Simplifying choice through attribute preservation or non-attendance: Implications for willingness to pay. Transportation Research Part E 45:583–590.

Hensher, D.A., Rose, J.M. and Greene, W.H. (2015). Applied choice analysis. Second edition. Cambridge University Press. Cambridge, UK.

Hess, S. and Hensher, D.A. (2010). Using conditioning on observed choices to retrieve individual-specific attribute processing strategies. Transportation Research Part B 44:781–790.

Johnson, E.J., Häubl, G. and Keinan, A. (2007). Aspects of endowment: A query theory of value construction. Journal of Experimental Psychology: Learning, Memory, and Cognition 33(3):461.

Scarpa, R., Campbell, D. and Hutchinson, W.G. (2007). Benefit estimates for landscape improvements: Sequential Bayesian design and respondents' rationality in a choice experiment. Land Economics 83:617–634.

Scarpa, R., Zanoli, R., Bruschi, V. and Naspetti, S. (2013). Inferred and stated attribute non-attendance in food choice experiments. American Journal of Agricultural Economics 95:165–180.
