How to Robustly Combine Judgements from Crowd Assessors with AWARE

Marco Ferrante, Nicola Ferro, Maria Maistro

Research output: Contribution to journal › Conference article › Research › peer-review

Abstract

We propose the Assessor-driven Weighted Averages for Retrieval Evaluation (AWARE) probabilistic framework, a novel methodology for dealing with multiple crowd assessors, who may be contradictory and/or noisy. By modeling relevance judgements and crowd assessors as sources of uncertainty, AWARE directly combines the performance measures computed on the ground truths generated by the crowd assessors, instead of adopting a classification technique to merge the labels they produce. We propose several unsupervised estimators that instantiate the AWARE framework and compare them with Majority Vote (MV) and Expectation Maximization (EM), showing that AWARE approaches outperform both in correctly ranking systems and in predicting their actual performance scores.
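To illustrate the core idea described in the abstract, the following minimal Python sketch contrasts label merging with measure-level combination. It assumes binary relevance, uses Average Precision as the example measure, and uses uniform weights as the simplest stand-in for the unsupervised estimators; the function names (`average_precision`, `majority_vote`, `aware_score`) are hypothetical and do not come from the paper.

```python
import numpy as np

def average_precision(relevance):
    """AP of a ranked list given binary relevance labels (1 = relevant)."""
    relevance = np.asarray(relevance, dtype=float)
    if relevance.sum() == 0:
        return 0.0
    precision_at_k = np.cumsum(relevance) / (np.arange(len(relevance)) + 1)
    return float((precision_at_k * relevance).sum() / relevance.sum())

def majority_vote(labels_per_assessor):
    """Merge per-assessor binary labels into a single ground truth (ties -> non-relevant)."""
    votes = np.asarray(labels_per_assessor)  # shape: (assessors, documents)
    return (votes.mean(axis=0) > 0.5).astype(int)

def aware_score(ranked_labels_per_assessor, weights=None):
    """AWARE-style combination: weighted average of the measure computed on each
    assessor's own ground truth, instead of merging the labels first."""
    scores = np.array([average_precision(labels) for labels in ranked_labels_per_assessor])
    if weights is None:  # uniform weights as the simplest unsupervised choice
        weights = np.full(len(scores), 1.0 / len(scores))
    return float(np.dot(weights, scores))

# Toy example: one system run of 5 documents judged by 3 (noisy) crowd assessors.
# Rows are assessors; columns follow the system's ranking of the documents.
judgements = [
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
]

ap_on_merged = average_precision(majority_vote(judgements))  # label-merging baseline (MV)
ap_aware = aware_score(judgements)                           # measure-level combination
print(f"MV-based AP:    {ap_on_merged:.3f}")
print(f"AWARE-style AP: {ap_aware:.3f}")
```

In the actual framework the uniform weights would be replaced by the unsupervised estimators of assessor quality proposed in the paper.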

Original language: English
Journal: CEUR Workshop Proceedings
Volume: 2161
Pages (from-to): 1
ISSN: 1613-0073
Publication status: Published - 1 Jan 2018
Externally published: Yes
Event: 26th Italian Symposium on Advanced Database Systems, SEBD 2018 - Castellaneta Marina (Taranto), Italy
Duration: 24 Jun 2018 - 27 Jun 2018

Conference

Conference: 26th Italian Symposium on Advanced Database Systems, SEBD 2018
Country/Territory: Italy
City: Castellaneta Marina (Taranto)
Period: 24/06/2018 - 27/06/2018
Sponsor: CC ICT-SUD

Keywords

  • AWARE
  • Crowdsourcing
  • Unsupervised estimators
