Towards reproducibility in recommender-systems research
Docear, Germany; Konstanz University, Germany
Linnaeus University, Faculty of Technology, Department of Media Technology; Docear, Germany
Docear, Germany; Otto-von-Guericke University, Germany
Technische Universität Berlin, Germany
2016 (English). In: User Modeling and User-Adapted Interaction, ISSN 0924-1868, E-ISSN 1573-1391, vol. 26, no. 1, pp. 69-101. Article in journal (refereed), published.
Abstract [en]

Numerous recommendation approaches are in use today. However, comparing their effectiveness is a challenging task because evaluation results are rarely reproducible. In this article, we examine the challenge of reproducibility in recommender-system research. We conduct experiments using Plista's news recommender system and Docear's research-paper recommender system. The experiments show that there are large discrepancies in the effectiveness of identical recommendation approaches in only slightly different scenarios, as well as large discrepancies for slightly different approaches in identical scenarios. For example, in one news-recommendation scenario, the performance of a content-based filtering approach was twice that of the second-best approach, while in another scenario the same content-based filtering approach was the worst-performing approach. We found several determinants that may contribute to the large discrepancies observed in recommendation effectiveness. Determinants we examined include user characteristics (gender and age), datasets, weighting schemes, the time at which recommendations were shown, and user-model size. Some of the determinants have interdependencies. For instance, the optimal size of an algorithm's user model depended on users' age. Since minor variations in approaches and scenarios can lead to significant changes in a recommendation approach's performance, ensuring reproducibility of experimental results is difficult. We discuss these findings and conclude that to ensure reproducibility, the recommender-system community needs to (1) survey other research fields and learn from them, (2) find a common understanding of reproducibility, (3) identify and understand the determinants that affect reproducibility, (4) conduct more comprehensive experiments, (5) modernize publication practices, (6) foster the development and use of recommendation frameworks, and (7) establish best-practice guidelines for recommender-systems research.
© 2016, Springer Science+Business Media Dordrecht.
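The abstract's central observation, that a minor variation such as the choice of term-weighting scheme can flip which approach appears best, can be illustrated with a toy sketch. All corpus statistics, term counts, and item names below are hypothetical and invented for illustration; they are not data from the paper.

```python
import math

# Hypothetical toy corpus statistics (illustrative only, not from the paper):
# "news" occurs in every document, "quantum" in just one.
N_DOCS = 4
DF = {"news": 4, "quantum": 1}

def idf(term):
    """Inverse document frequency: log(N / df). Common terms get weight ~0."""
    return math.log(N_DOCS / DF[term])

def score(profile, item, use_idf):
    """Dot-product similarity between a user profile and a candidate item,
    computed either on raw term frequencies or on IDF-weighted frequencies."""
    total = 0.0
    for term, tf_profile in profile.items():
        tf_item = item.get(term, 0)
        weight = idf(term) if use_idf else 1.0
        total += (tf_profile * weight) * (tf_item * weight)
    return total

# A user who reads mostly generic news but once clicked a quantum article.
profile = {"news": 5, "quantum": 1}
items = {"A": {"news": 3}, "B": {"quantum": 2}}

for use_idf in (False, True):
    top = max(items, key=lambda k: score(profile, items[k], use_idf))
    scheme = "tf-idf" if use_idf else "raw tf"
    print(f"{scheme}: top recommendation = {top}")
# → raw tf: top recommendation = A
# → tf-idf: top recommendation = B
```

Under raw term frequencies item A (generic news) wins; switching to IDF weighting zeroes out the ubiquitous term and item B wins instead. Two papers that report "content-based filtering" but differ in this one configuration detail would reach opposite conclusions, which is the kind of reproducibility hazard the article examines.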

Place, publisher, year, edition, pages
2016. Vol. 26, no. 1, pp. 69-101.
Keyword [en]
Evaluation, Experimentation, Recommender systems, Reproducibility
National Category
Computer Systems
Research subject
Computer and Information Sciences, Computer Science
Identifiers
URN: urn:nbn:se:lnu:diva-56225
DOI: 10.1007/s11257-016-9174-x
Scopus ID: 2-s2.0-84960395171
OAI: oai:DiVA.org:lnu-56225
DiVA: diva2:957205
Available from: 2016-09-01. Created: 2016-08-31. Last updated: 2017-01-11. Bibliographically approved.

Open Access in DiVA

No full text

Other links

Publisher's full text · Scopus · Fulltext (read only)

By author/editor: Breitinger, Corinna
By organisation: Department of Media Technology
In the same journal: User Modeling and User-Adapted Interaction
