000 | 01769pab a2200205 454500 | ||
---|---|---|---|
008 | 140923b0 xxu||||| |||| 00| 0 eng d | ||
040 | _cWelingkar Institute of Management Development & Research, Mumbai _aWelingkar Institute of Management Development & Research, Mumbai | ||
041 | _aENG | ||
082 | _a _bSwa | ||
100 | _aSwain, Scott D. | ||
245 | _aAssessing Three Sources of Misresponse to Reversed Likert Items | ||
250 | _a1 | ||
260 | _a _bFeb 2008 _c0 | ||
300 | _a116-131 Pp. | ||
490 | _vXLV | ||
520 | _aData collected through multi-item Likert scales that contain reversed items often exhibit problems, such as unexpected factor structures and diminished scale reliabilities. These problems arise when respondents select responses on the same side of the scale neutral point for both reversed and nonreversed items, a phenomenon the authors call misresponse. Across four experiments and an exploratory study using published data, the authors find that misresponse to reversed Likert items averaged approximately 20%, twice the level identified as problematic in previous simulation studies. Counter to prevailing thought, the patterns of misresponse and response latency across manipulated items could not be attributed to respondent inattention or acquiescence. Instead, the pattern supports an item verification difficulty explanation, which holds that task complexity, and thus misresponse and response latency, increases with the number of cognitive operations required for a respondent to compare a scale item with his or her belief. The observed results are well explained by the constituent comparison model. | ||
650 | _aMeasurement, Likert Scales | ||
856 | _uhttp://192.168.6.13/libsuite/mm_files/Articles/AR9579.pdf | ||
906 | _a28330 | ||
999 | _c29456 _d29456 | ||