Multi-criteria Ranking Based on Joint Distributions: A Tool to Support Decision Making
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM). (DISA;DSIQ;DISTA) ORCID iD: 0000-0002-3906-7611
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM). (DISA;DSIQ;DISTA) ORCID iD: 0000-0003-1173-5187
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM). (DISA;DSIQ;DISTA) ORCID iD: 0000-0002-7565-3714
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM). (DISA;DSIQ;DISTA) ORCID iD: 0000-0002-0835-823X
2019 (English). In: Perspectives in Business Informatics Research. BIR 2019: 18th International Conference on Business Informatics Research / [ed] Pańkowska M., Sandkuhl K., Springer, 2019, p. 74-88. Conference paper, Published paper (Refereed)
Abstract [en]

Sound assessment and ranking of alternatives are fundamental to effective decision making. Creating an overall ranking is not trivial if there are multiple criteria, and none of the alternatives is the best according to all criteria. To address this challenge, we propose an approach that aggregates criteria scores based on their joint (probability) distribution and obtains the ranking as a weighted product of these scores. We evaluate our approach in a real-world use case based on a funding allocation problem and compare it with the traditional weighted sum aggregation model. The results show that the approaches assign similar ranks, while our approach is more interpretable and sensitive.
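To make the comparison concrete, the following is a minimal sketch, assuming that each raw criterion value is first mapped to an empirical probability (the share of alternatives scoring equal or worse on that criterion) and that these probabilities are then combined multiplicatively (weighted product) and, as a baseline, additively (weighted sum). The function names, toy data, and weights are hypothetical; the paper's joint-distribution-based aggregation may differ in detail.

    # Illustrative sketch only (not the paper's exact algorithm): each raw
    # criterion value is mapped to an empirical probability -- the share of
    # alternatives scoring equal or worse on that criterion -- and these
    # probabilities are combined by a weighted product and, for comparison,
    # by the traditional weighted sum.
    import numpy as np

    def empirical_scores(raw):
        """Map each criterion column to P(equal or worse score), per alternative."""
        raw = np.asarray(raw, dtype=float)
        n, m = raw.shape
        return np.array([[(raw[:, j] <= raw[i, j]).sum() / n for j in range(m)]
                         for i in range(n)])

    def weighted_product(scores, weights):
        """Multiplicative aggregation of probability-like scores."""
        return np.prod(np.asarray(scores) ** np.asarray(weights), axis=1)

    def weighted_sum(scores, weights):
        """Traditional additive aggregation used as the baseline."""
        return np.asarray(scores) @ np.asarray(weights)

    # Toy funding-allocation data: four proposals scored on three criteria
    # (higher raw score is better); data and weights are made up.
    raw = [[7, 3, 9],
           [5, 8, 6],
           [9, 6, 4],
           [6, 7, 7]]
    weights = [0.5, 0.3, 0.2]

    scores = empirical_scores(raw)
    print("weighted-product ranking (best first):",
          np.argsort(-weighted_product(scores, weights)))
    print("weighted-sum ranking (best first):    ",
          np.argsort(-weighted_sum(scores, weights)))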

Place, publisher, year, edition, pages
Springer, 2019. p. 74-88
Series
Lecture Notes in Business Information Processing, ISSN 1865-1348, E-ISSN 1865-1356 ; 365
Keywords [en]
Aggregation, Management by objectives, Ranking
National Category
Software Engineering
Research subject
Computer and Information Sciences Computer Science, Computer Science; Computer Science, Software Technology
Identifiers
URN: urn:nbn:se:lnu:diva-89171
DOI: 10.1007/978-3-030-31143-8_6
Scopus ID: 2-s2.0-85075255852
ISBN: 978-3-030-31142-1 (print)
ISBN: 978-3-030-31143-8 (electronic)
OAI: oai:DiVA.org:lnu-89171
DiVA, id: diva2:1352122
Conference
18th International Conference, BIR 2019, Katowice, Poland, September 23-25, 2019
Funder
Knowledge Foundation, 20150088
Available from: 2019-09-17 Created: 2019-09-17 Last updated: 2024-08-29 Bibliographically approved
In thesis
1. Aggregation as Unsupervised Learning in Software Engineering and Beyond
2021 (English). Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Ranking alternatives is fundamental to effective decision making. However, creating an overall ranking is difficult if there are multiple criteria, and no single alternative performs best across all criteria. Software engineering is no exception.

Software quality is usually decomposed hierarchically into characteristics, whose quality can be assessed by various direct and indirect metrics. Although such quality models provide a basic understanding of what data to collect and which metrics to use, it is not clear how the metrics should be combined into an overall quality assessment. Because metrics can be aggregated in different ways, the same quality model and the same metrics applied to the same software artifact can still lead to different assessment results and even to different interpretations.

The aggregation approach proposed in this thesis is well-defined, interpretable, and applicable under realistic conditions. It can turn quality-model- and metric-based assessment of (software) quality into a reliable and reproducible process. We express quality as the probability of detecting something with equal or worse quality, based on all software artifacts observed; good and bad quality are expressed in terms of lower and higher probabilities, respectively.
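As a rough illustration of this probability-based view of quality, the sketch below maps raw metric values to the empirical probability of observing an artifact with an equal or worse value and then aggregates these probabilities for a small, hypothetical quality model. The metric names, weights, and the weighted-product aggregation step are assumptions for illustration, not the thesis' exact procedure.

    # Minimal sketch, assuming the probability-based quality score described
    # above: a raw metric value is mapped to the empirical probability of
    # observing an artifact with an equal or worse value.  The metric names,
    # weights, and the weighted-product aggregation are illustrative only.
    import numpy as np

    def prob_equal_or_worse(values, worse_is_larger=True):
        """Empirical probability of observing an equal or worse metric value."""
        v = np.asarray(values, dtype=float)
        if worse_is_larger:
            counts = (v[None, :] >= v[:, None]).sum(axis=1)
        else:
            counts = (v[None, :] <= v[:, None]).sum(axis=1)
        return counts / len(v)

    # Hypothetical metrics observed for five artifacts (larger value = worse).
    complexity = [4, 12, 7, 25, 9]
    duplication = [0.02, 0.30, 0.10, 0.05, 0.18]

    p_complexity = prob_equal_or_worse(complexity)
    p_duplication = prob_equal_or_worse(duplication)

    # Aggregate the per-metric probabilities into one maintainability score per
    # artifact; the weights and the multiplicative combination are assumptions.
    weights = np.array([0.6, 0.4])
    maintainability = p_complexity ** weights[0] * p_duplication ** weights[1]
    print("per-artifact maintainability scores:", np.round(maintainability, 3))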

We validated our approach theoretically and empirically, conducting empirical studies on bug prediction, maintainability assessment, and information quality.

We used software visualization to study the usability of aggregation for analyzing multivariate data in general and the effects of alternative aggregation approaches; to this end, we designed and implemented an exploratory multivariate data visualization tool.

Finally, we applied our approach to multi-criteria ranking to evaluate its transferability to other domains, using a real-world decision-making problem for the assessment and ranking of alternatives. Moreover, we applied it in the context of machine learning: we created a benchmark from a collection of regression problems and evaluated how well the aggregation output agrees with a ground truth and how well it represents the properties of the input variables.

The results showed that our approach is not only theoretically sound but also accurate and sensitive; it identifies anomalies, scales in performance, and can support multi-criteria decision making. Furthermore, it is transferable to other domains that require aggregation in hierarchically structured models, and it can be used as an agnostic unsupervised predictor in the absence of a ground truth.

Place, publisher, year, edition, pages
Växjö: Linnaeus University Press, 2021. p. 51
Series
Linnaeus University Dissertations ; 430
Keywords
quality assessment, quantitative methods, aggregation, multi-criteria decision making, unsupervised machine learning
National Category
Computer Sciences
Research subject
Computer and Information Sciences Computer Science
Identifiers
URN: urn:nbn:se:lnu:diva-108115
ISBN: 9789189460409
ISBN: 9789189460416
Public defence
2021-12-17, Weber, building K, Växjö, 13:00 (English)
Opponent
Supervisors
Available from: 2021-11-24 Created: 2021-11-19 Last updated: 2024-03-06 Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text; Scopus

Authority records

Ulan, Maria; Ericsson, Morgan; Löwe, Welf; Wingkvist, Anna
