
Frontiers in Organizational Psychology, Journal Year: 2025, Issue 3
Published: April 9, 2025
Introduction: To create training data for AI systems, correct labels must be manually assigned to a large number of objects; this task is often performed through crowdsourcing. The work is usually divided into smaller, more manageable segments, and workers complete them one after another. In this study, assuming the above task, we investigated whether evaluation feedback on deliverables and the provision of additional rewards contribute to improving workers' motivation, that is, their persistence in and performance of the tasks.

Method: We conducted a user experiment on a real crowdsourcing service platform. We provided first- and second-round tasks, which asked workers to input flower species. We developed an experimental system that assessed the products of each first-round worker and presented the evaluation results to the worker. Six hundred forty-five workers participated in the experiment. They were divided into high- and low-performing groups according to their scores (correct answer ratio). The performance and continuation ratio of each group, with and without the interventions, were compared.

Results: We found that presenting the evaluations increased the continuation rate of high-quality workers but did not increase that of low-quality workers. Providing either type of additional reward reduced the continuation rate, and the amount of reduction was larger for low-quality workers than for high-quality workers. The effects, however, depended largely on the worker's performance level. Although not statistically significant, performance was highest among those who were shown the evaluations and given the rewards.

Discussion: Previous studies have reported that such interventions positively affected motivation, which is inconsistent with our results. One possible reason is that previous studies examined future engagement in different tasks, whereas our study examined successive rounds of almost the same task. In conclusion, it is better to offer both evaluation feedback and rewards when the quality of deliverables is the priority, and to give only evaluation feedback when quantity is the priority.
Language: English