Applying deep learning to reduce large adaptation spaces of self-adaptive systems with multiple types of goals
Katholieke Universiteit Leuven, Belgium.
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM); Katholieke Universiteit Leuven, Belgium. ORCID iD: 0000-0002-1162-0817
Katholieke Universiteit Leuven, Belgium.
Katholieke Universiteit Leuven, Belgium.
2020 (English). In: Proceedings - 2020 IEEE/ACM 15th International Symposium on Software Engineering for Adaptive and Self-Managing Systems, SEAMS 2020, ACM Publications, 2020, p. 20-30. Conference paper, Published paper (Refereed)
Abstract [en]

When a self-adaptive system needs to adapt, it has to analyze the possible options for adaptation, i.e., the adaptation space. For systems with large adaptation spaces, this analysis process can be resource- and time-consuming. One approach to tackle this problem is using machine learning techniques to reduce the adaptation space to only the relevant adaptation options. However, existing approaches only handle threshold goals, while practical systems often also need to address optimization goals. To tackle this limitation, we propose a two-stage learning approach called Deep Learning for Adaptation Space Reduction (DLASeR). DLASeR applies a deep learner first to reduce the adaptation space for the threshold goals and then ranks these options for the optimization goal. A benefit of deep learning is that it does not require feature engineering. Results on two instances of the DeltaIoT artifact (with different sizes of adaptation space) show that DLASeR outperforms a state-of-the-art approach for settings with only threshold goals. The results for settings with both threshold goals and an optimization goal show that DLASeR is effective with a negligible effect on the realization of the adaptation goals. Finally, we observe no noteworthy effect on the effectiveness of DLASeR for larger sizes of adaptation spaces. © 2020 ACM.
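
The abstract describes a two-stage pipeline: a deep learner first filters the adaptation space down to options likely to satisfy the threshold goals, and a second learner then ranks the surviving options for the optimization goal. The sketch below illustrates that general idea in PyTorch; the network sizes, feature count, cutoff, and function names are illustrative assumptions, not the authors' DLASeR implementation.

```python
# Minimal sketch of a two-stage "filter then rank" reduction of an adaptation
# space. All architecture choices here are hypothetical.
import torch
import torch.nn as nn

N_FEATURES = 12  # assumed number of features describing one adaptation option

# Stage 1: classifier predicts whether an option satisfies the threshold goals.
threshold_classifier = nn.Sequential(
    nn.Linear(N_FEATURES, 64), nn.ReLU(),
    nn.Linear(64, 1), nn.Sigmoid(),  # probability of meeting all threshold goals
)

# Stage 2: regressor scores the remaining options for the optimization goal.
optimization_regressor = nn.Sequential(
    nn.Linear(N_FEATURES, 64), nn.ReLU(),
    nn.Linear(64, 1),                # predicted value of the optimization goal
)

def reduce_adaptation_space(options: torch.Tensor, cutoff: float = 0.5) -> torch.Tensor:
    """Keep only options likely to meet the threshold goals, then return them
    ordered by the predicted value of the optimization goal (best first)."""
    with torch.no_grad():
        keep = threshold_classifier(options).squeeze(-1) >= cutoff  # stage 1: filter
        candidates = options[keep]
        scores = optimization_regressor(candidates).squeeze(-1)     # stage 2: rank
        order = torch.argsort(scores, descending=True)
    return candidates[order]

# Usage example: reduce and rank 1000 randomly generated adaptation options.
ranked_options = reduce_adaptation_space(torch.rand(1000, N_FEATURES))
```

In this reading, the classifier shrinks the space the analyzer must verify, while the regressor preserves an ordering for the optimization goal; both would be trained on goal evaluations collected at runtime.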

Place, publisher, year, edition, pages
ACM Publications, 2020. p. 20-30
Keywords [en]
adaptation space, deep learning, self-adaptation, Adaptive systems, Learning systems, Software engineering, Feature engineerings, Learning approach, Machine learning techniques, Optimization goals, Practical systems, Self-adaptive system, Space reductions, State-of-the-art approach
National Category
Computer Sciences
Research subject
Computer and Information Sciences, Computer Science
Identifiers
URN: urn:nbn:se:lnu:diva-108446
DOI: 10.1145/3387939.3391605
Scopus ID: 2-s2.0-85091531697
ISBN: 9781450379625 (print)
OAI: oai:DiVA.org:lnu-108446
DiVA, id: diva2:1618004
Conference
IEEE/ACM 15th International Symposium on Software Engineering for Adaptive and Self-Managing Systems, June 2020
Available from: 2021-12-08. Created: 2021-12-08. Last updated: 2021-12-08. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Weyns, Danny
