MusicTime    R Documentation

Estimating Time with Different Music Playing

Description

Estimates of a 45-second interval made while different types of music were playing (or in silence)

Format

A data frame with 60 observations on the following 6 variables.

MusicBg

Music playing in the background (no or yes)

Subject

Code for each subject (subj1 through subj20)

Sex

Subject's sex (f=female or m=male)

TimeGuess

Subject's estimate of when 45 seconds had elapsed (in seconds)

Music

Type of music (calm, control, or upbeat)

Accuracy

Absolute value of TimeGuess minus 45

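A minimal sketch of inspecting this structure in R; the object name MusicTime is an assumption (the data frame may first need to be loaded from its package, e.g. with data()).

# Assumes the data frame is already available as `MusicTime` (name not confirmed here)
str(MusicTime)          # expect 60 observations of 6 variables
head(MusicTime)         # MusicBg, Subject, Sex, TimeGuess, Music, Accuracy
table(MusicTime$Music)  # each of the 20 subjects appears once per condition
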
Details

Participants were asked to judge when 45 seconds had passed in silence (control), while listening to an upbeat song (Metropolis, by David Guetta and Nicky Romero), and while listening to a calm song (Bach's Das Wohltemperierte Klavier, Prelude in C Major). The order in which the three conditions were experienced was randomized for each participant. The time at which each subject guessed that 45 seconds had elapsed (TimeGuess) and the absolute difference of that guess from 45 seconds (Accuracy) were recorded.

Source

Data collected by Ksenia Vlasov at Oberlin College.
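
Examples

A short sketch of working with these variables, again assuming the data frame is available under the name MusicTime; it is illustrative only, not the original analysis.

# Accuracy is the absolute value of TimeGuess minus 45
with(MusicTime, all.equal(Accuracy, abs(TimeGuess - 45)))

# Mean guess and mean accuracy by music condition
aggregate(cbind(TimeGuess, Accuracy) ~ Music, data = MusicTime, FUN = mean)

# Repeated-measures comparison of Accuracy across conditions,
# treating Subject as the error stratum (a sketch, not the authors' analysis)
summary(aov(Accuracy ~ Music + Error(Subject), data = MusicTime))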