Recent studies have empirically validated data obtained from Amazon's Mechanical Turk. Mechanical Turk workers behaved similarly to conventional samples not only in simple surveys but also in cognitive behavioral experiments that employ multiple trials and require continuous attention to the task. The present study aimed to extend these findings to data from a Japanese crowdsourcing pool, whose participants have ethnic backgrounds different from those of Mechanical Turk workers. In five cognitive experiments, including the Stroop and flanker tasks, the reaction times and error rates of Japanese crowdsourcing workers were compared with those of university students. The results were consistent with those of previous studies, although the students responded more quickly but less accurately than the workers. These findings suggest that the Japanese crowdsourcing sample is a viable participant pool for behavioral research; however, further investigation is needed to address qualitative differences between student and worker samples.