DOI

10.17077/etd.s8rl-t0r0

Document Type

Thesis

Date of Degree

Spring 2019

Degree Name

MA (Master of Arts)

Degree In

Psychological and Quantitative Foundations

First Advisor

Dunbar, Stephen

First Committee Member

Welch, Catherine

Second Committee Member

Colbert, Amy

Abstract

Crowdsourcing has gained favor among many social scientists as a method for collecting data because it is both time- and resource-efficient. The present study uses a within-subject test-retest design to evaluate the psychometric characteristics of crowdsourced samples for developing and field testing measurement instruments. As evidenced by similar patterns of psychometric characteristics across time, strong test-retest reliability, and low failure rates on attention check items, the results of this study provide evidence that Amazon Mechanical Turk may be a fruitful platform for field testing to support the development of a variety of measures. These findings, in turn, have significant implications for resource efficiency in the fields of educational and organizational measurement.

Keywords

Amazon MTurk, Crowdsourcing, Interest Inventory, Measurement, Online Panel

Pages

x, 69 pages

Bibliography

Includes bibliographical references (pages 52-56).

Copyright

Copyright © 2019 Emily Michelle Wetherell
