Evaluating Crowdsourcing through Amazon Mechanical Turk as a Technique for Conducting Music Perception Experiments

Authors

  • Jieun Oh
  • Ge Wang
Abstract

Online crowdsourcing marketplaces, such as the Amazon Mechanical Turk, provide an environment for cost-effective crowdsourcing on a massive scale, leveraging human intelligence, expertise, and judgment. While the Mechanical Turk is typically used by businesses to clean data, categorize items, and moderate content, the scientific community, too, has begun experimenting with it to conduct academic research. In this paper, we evaluate crowdsourcing as a technique for conducting music perception experiments by first describing how principles of experimental design can be implemented on the Mechanical Turk. Then, we discuss the pros and cons of online crowdsourcing with respect to subject demography, answer quality, recruitment cost, and ethical concerns. Finally, we address audio-specific factors relevant to researchers in the field of music perception and cognition. The goal of this review is to offer practical guidelines for designing experiments that best leverage the benefits and overcome the challenges of employing crowdsourcing as a research methodology.
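
As a rough, hypothetical illustration of the kind of setup the paper evaluates (not code from the paper itself), the Python sketch below uses the boto3 MTurk Requester client to publish a listening task as an external HIT; the survey URL, reward, and participant counts are placeholder assumptions.

    # Hypothetical sketch: posting a listening-task HIT via the MTurk API (boto3).
    # The URL, reward, and counts below are illustrative assumptions only.
    import boto3

    # Sandbox endpoint so the sketch can be tried without paying workers.
    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # An ExternalQuestion embeds the experimenter's own web page (which can play
    # audio stimuli and record responses) inside the MTurk worker interface.
    external_question = """
    <ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
      <ExternalURL>https://example.org/music-perception-trial</ExternalURL>
      <FrameHeight>600</FrameHeight>
    </ExternalQuestion>
    """

    hit = mturk.create_hit(
        Title="Short music listening study (about 10 minutes)",
        Description="Listen to brief audio clips and answer a few questions.",
        Keywords="audio, music, listening, survey, experiment",
        Reward="0.50",                       # USD per assignment (assumed)
        MaxAssignments=30,                   # participants per condition (assumed)
        AssignmentDurationInSeconds=30 * 60,
        LifetimeInSeconds=3 * 24 * 60 * 60,
        Question=external_question,
    )
    print("HIT group:", hit["HIT"]["HITGroupId"])

In a design like this, condition assignment, stimulus playback, and response logging all live on the experimenter's external page, while Mechanical Turk handles only recruitment and payment.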

Similar Articles

Exploring Mental Lexicon in an Efficient and Economic Way: Crowdsourcing Method for Linguistic Experiments

Mental lexicon plays a central role in human language competence and inspires the creation of new lexical resources. The traditional linguistic experiment method used to explore the mental lexicon has some disadvantages. Crowdsourcing has become a promising method for conducting linguistic experiments, enabling us to explore the mental lexicon in an efficient and economic way. We focus on the fe...

Can we get rid of TREC assessors? Using Mechanical Turk for relevance assessment

Recently, Amazon Mechanical Turk has gained a lot of attention as a tool for conducting different kinds of relevance evaluations. In this paper we show a series of experiments on TREC data, evaluate the outcome, and discuss the results. Our position, supported by these preliminary experimental results, is that crowdsourcing is a viable alternative for relevance assessment.

How to use Mechanical Turk for Cognitive Science Research

In this half-day tutorial we will describe a new kind of tool that has emerged in the last five years for conducting online behavioral research: crowdsourcing platforms. The term crowdsourcing has its origin in an article by Howe (2006), who defined it as a job outsourced to an undefined group of people in the form of an open call. One of the main benefits of these platforms to behavioral researchers is t...

Web-Based VR Experiments Powered by the Crowd

We build on the increasing availability of Virtual Reality (VR) devices and Web technologies to conduct behavioral experiments in VR using crowdsourcing techniques. A new recruiting and validation method allows us to create a panel of eligible experiment participants recruited from Amazon Mechanical Turk (a hypothetical sketch of one such panel mechanism follows this list). Using this panel, we ran three different crowdsourced VR experiments, each reproducing one...

Emotion in Music Task at MediaEval 2014

Emotional expression is an important property of music, and emotional characteristics are therefore natural criteria for music indexing and recommendation. The Emotion in Music task addresses automatic music emotion prediction and was held for the second year in 2014. Compared to the previous year, we modified the task by offering a new feature development subtask and releasing a new ev...

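Relatedly, the "Web-Based VR Experiments Powered by the Crowd" entry above mentions building a pre-validated panel of Mechanical Turk participants. One plausible way to implement such a panel (a hypothetical sketch, not the authors' published method) is a custom qualification type that is granted after a screening task and then required on follow-up HITs:

    # Hypothetical sketch: restricting follow-up HITs to a screened worker panel
    # using a custom MTurk qualification. All names and values are assumptions.
    import boto3

    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # Create a private qualification marking workers who passed the screening task.
    panel = mturk.create_qualification_type(
        Name="Passed screening (example panel)",
        Description="Granted to workers who completed the screening task successfully.",
        QualificationTypeStatus="Active",
    )
    panel_id = panel["QualificationType"]["QualificationTypeId"]

    # Grant the qualification to a screened worker (placeholder worker ID).
    mturk.associate_qualification_with_worker(
        QualificationTypeId=panel_id,
        WorkerId="A1EXAMPLEWORKERID",
        IntegerValue=1,
        SendNotification=False,
    )

    # Follow-up HITs can then require the qualification (passed as
    # QualificationRequirements to create_hit) so that only panel members
    # can discover, preview, and accept them.
    panel_requirement = [{
        "QualificationTypeId": panel_id,
        "Comparator": "EqualTo",
        "IntegerValues": [1],
        "ActionsGuarded": "DiscoverPreviewAndAccept",
    }]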


Journal:

Volume   Issue

Pages  -

Publication date: 2012