
2020 | OriginalPaper | Chapter

s-AWARE: Supervised Measure-Based Methods for Crowd-Assessors Combination

Authors : Marco Ferrante, Nicola Ferro, Luca Piazzon

Published in: Experimental IR Meets Multilinguality, Multimodality, and Interaction

Publisher: Springer International Publishing


Abstract

Ground-truth creation is one of the most demanding activities in terms of time, effort, and resources needed for creating an experimental collection. For this reason, crowdsourcing has emerged as a viable option to reduce the costs and time invested in it.
An effective assessor merging methodology is crucial to guarantee good ground-truth quality. The classical approach involves aggregating labels from multiple assessors using voting and/or classification methods. Recently, Assessor-driven Weighted Averages for Retrieval Evaluation (AWARE) has been proposed as an unsupervised alternative, which optimizes the final evaluation measure computed from multiple judgments, rather than the labels themselves.
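As a concrete illustration of the classical label-aggregation approach mentioned above, the sketch below merges per-document relevance labels by simple majority voting. This is a generic example, not a method from this chapter; the `judgments` data and the tie-breaking rule are illustrative assumptions:

```python
from collections import Counter

def majority_vote(labels_per_doc):
    """Merge multiple assessors' relevance labels per document by
    majority vote; ties are broken toward the lower (less relevant) label."""
    merged = {}
    for doc, labels in labels_per_doc.items():
        counts = Counter(labels)
        merged[doc] = max(counts, key=lambda label: (counts[label], -label))
    return merged

# Three assessors judge the same pool (1 = relevant, 0 = not relevant)
judgments = {"d1": [1, 1, 0], "d2": [0, 0, 1], "d3": [1, 0, 0]}
print(majority_vote(judgments))  # {'d1': 1, 'd2': 0, 'd3': 0}
```

Classification-based variants replace the vote with a learned model, but the key point is the same: the output is a single merged set of labels, from which evaluation measures are then computed.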
In this paper, we propose s-AWARE, a supervised version of AWARE. We tested s-AWARE against a range of state-of-the-art methods and the unsupervised AWARE on several TREC collections. We analysed how the performance of these methods changes as assessors' judgement sparsity increases, highlighting that s-AWARE is an effective approach in a real scenario.
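To illustrate the measure-based idea behind (s-)AWARE, the following hypothetical sketch weights each crowd assessor by how close the measure scores computed from their judgments (e.g., per-topic AP) are to gold-standard scores on training topics, then combines per-assessor scores with a weighted average. The inverse-mean-squared-error weighting rule and all data are illustrative assumptions, not the paper's exact estimator:

```python
def saware_weights(per_assessor_scores, gold_scores):
    """Assign each assessor a weight inversely proportional to the mean
    squared distance between their measure scores and gold-standard
    scores on training topics; weights are normalized to sum to 1."""
    closeness = []
    for scores in per_assessor_scores:
        mse = sum((s - g) ** 2 for s, g in zip(scores, gold_scores)) / len(gold_scores)
        closeness.append(1.0 / (mse + 1e-9))  # small epsilon avoids division by zero
    total = sum(closeness)
    return [c / total for c in closeness]

def combine(per_assessor_scores, weights):
    """Weighted average of per-assessor measure scores, topic by topic."""
    n_topics = len(per_assessor_scores[0])
    return [
        sum(w * scores[t] for w, scores in zip(weights, per_assessor_scores))
        for t in range(n_topics)
    ]

# Two assessors' (made-up) AP scores on three training topics, plus gold AP
a1 = [0.30, 0.50, 0.40]   # close to gold
a2 = [0.10, 0.90, 0.20]   # noisier
gold = [0.32, 0.48, 0.41]
w = saware_weights([a1, a2], gold)
print(w, combine([a1, a2], w))
```

The supervised step is the estimation of the weights from gold-standard scores; once learned, the weights are applied to combine the assessors' measure scores on new topics without ever producing a merged set of labels.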


Footnotes
1
The original AWARE methodology considered additional ways to quantify “closeness”, i.e., the Frobenius norm, Kullback-Leibler Divergence (KLD), and AP Correlation (APC). Here, we focus on the two approaches which produced the best and most stable results across different configurations.
 
Metadata
Title
s-AWARE: Supervised Measure-Based Methods for Crowd-Assessors Combination
Authors
Marco Ferrante
Nicola Ferro
Luca Piazzon
Copyright Year
2020
DOI
https://doi.org/10.1007/978-3-030-58219-7_2
