Danylenko, Antonina
Publications (10 of 12)
Danylenko, A. (2015). Decision Algebra: A General Approach to Learning and Using Classifiers. (Doctoral dissertation). Växjö: Linnaeus University Press
Decision Algebra: A General Approach to Learning and Using Classifiers
2015 (English) Doctoral thesis, monograph (Other academic)
Abstract [en]

Processing decision information is a vital part of the Computer Science fields in which pattern recognition problems arise. Decision information can be generalized as alternative decisions (or classes), attributes, and attribute values, which are the basis for classification. Different classification approaches exist, such as decision trees, decision tables, and Naïve Bayesian classifiers, which capture and manipulate decision information in order to construct a specific decision model (or classifier). These approaches are often tightly coupled to learning strategies, special data structures, the special characteristics of the decision information captured, and the way certain problems, e.g., memory consumption or low accuracy, are addressed. This situation complicates the choice, comparison, combination, and manipulation of different decision models learned over the same or different samples of decision information. Choosing and comparing decision models is not merely a matter of picking the model with the higher prediction accuracy: a decision model, when used in a certain application, often also has an impact on the application's performance. Moreover, the combination and manipulation of different decision models are usually implementation- or application-specific, lacking the generality needed to construct decision models with combined or modified decision information, and they are difficult to transfer from one application domain to another. In order to unify different approaches, we define Decision Algebra, a theoretical framework that represents decision models as higher-order decision functions abstracting from their implementation details.
Decision Algebra defines the operations necessary to decide, combine, approximate, and manipulate decision functions, along with operation signatures and general algebraic laws. Due to its algebraic completeness (i.e., a complete algebraic semantics of the operations and an efficient implementation), defining and developing decision models is simple: a new instance only requires implementing one core operation from which the other operations can be derived. Another advantage of Decision Algebra is composability: it allows combining decision models constructed using different approaches, and the accuracy and learning-convergence properties of the combined model can be proven regardless of the actual approach. In addition, applications that process decision information can be defined using Decision Algebra regardless of the underlying classification approach. For example, we use Decision Algebra in the context-aware composition domain, where we show that context-aware applications improve performance when using Decision Algebra. We also suggest an approach for integrating this context-aware component into legacy applications.
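The abstract describes decision models as higher-order decision functions, with one core operation from which the others can be derived. A minimal Python sketch of that idea follows; all names, and the rule of averaging class distributions in merge, are illustrative assumptions of mine, not the thesis's actual definitions:

```python
# Hedged sketch: a decision function maps an attribute vector to a
# class-probability distribution (a dict); merge() combines two decision
# functions, and decide() is derived on top of any decision function.

def learn_constant(label):
    """A trivial decision function that always predicts `label`."""
    return lambda attrs: {label: 1.0}

def merge(df1, df2):
    """Combine two decision functions by averaging their distributions."""
    def merged(attrs):
        d1, d2 = df1(attrs), df2(attrs)
        classes = set(d1) | set(d2)
        return {c: (d1.get(c, 0.0) + d2.get(c, 0.0)) / 2 for c in classes}
    return merged

def decide(df, attrs):
    """Apply a decision function and return the most probable class."""
    dist = df(attrs)
    return max(dist, key=dist.get)

spam = learn_constant("spam")
ham = learn_constant("ham")
combined = merge(spam, ham)
print(decide(spam, {}))              # spam
print(sorted(combined({}).items()))  # [('ham', 0.5), ('spam', 0.5)]
```

Because decision functions hide their implementation (tree, table, Bayesian model, ...), operations like `decide` and `merge` apply uniformly, which is the abstraction the framework builds on.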

Place, publisher, year, edition, pages
Växjö: Linnaeus University Press, 2015. p. 192
Series
Linnaeus University Dissertations ; 209
Keywords
classification, decision model, classifier, Decision Algebra, decision function
National Category
Computer Sciences
Research subject
Computer and Information Sciences Computer Science, Computer Science
Identifiers
urn:nbn:se:lnu:diva-43238 (URN) 978-91-87925-47-4 (ISBN)
Public defence
2015-02-27, Weber, Hus K, Växjö, 13:00 (English)
Available from: 2015-05-17 Created: 2015-05-16 Last updated: 2019-09-16. Bibliographically approved
Danylenko, A., Lundberg, J. & Löwe, W. (2014). Decisions: Algebra, Implementation, and First Experiments. Journal of universal computer science (Online), 20(9), 1174-1231
Decisions: Algebra, Implementation, and First Experiments
2014 (English) In: Journal of universal computer science (Online), ISSN 0948-695X, E-ISSN 0948-6968, Vol. 20, no 9, p. 1174-1231. Article in journal (Refereed) Published
Abstract [en]

Classification is a constitutive part of many different fields of Computer Science. Several approaches exist that capture and manipulate classification information in order to construct a specific classification model. These approaches are often tightly coupled to certain learning strategies, to special data structures for capturing the models, and to how common problems, e.g., fragmentation, replication, and model overfitting, are addressed. In order to unify these different classification approaches, we define Decision Algebra, which models classifiers as higher-order decision functions abstracting from their implementations using decision trees (or similar), decision rules, decision tables, etc. Decision Algebra defines operations for learning, applying, storing, merging, approximating, and manipulating classification models, along with some general algebraic laws that hold regardless of the implementation used. The Decision Algebra abstraction has several advantages. First, several useful Decision Algebra operations (e.g., learning and deciding) can be derived from the implementation of a few core operations (including merging and approximating). Second, applications using classification can be defined regardless of the different approaches. Third, certain properties of Decision Algebra operations can be proved regardless of the actual implementation. For instance, we show that the merger of a series of probably accurate decision functions is even more accurate, which can be exploited for efficient and general online learning. As a proof of the Decision Algebra concept, we compare decision trees with decision graphs, an efficient implementation of the Decision Algebra core operations that captures classification models in a non-redundant way. Compared to classical decision tree implementations, decision graphs are 20% faster in learning and classification without accuracy loss and reduce memory consumption by 44%. This is the result of experiments on a number of standard benchmark data sets comparing the accuracy, access time, and size of decision graphs and trees as constructed by the standard C4.5 algorithm. Finally, in order to test our hypothesis about increased accuracy when merging decision functions, we merged a series of decision graphs constructed over the data sets. The results show that the accuracy of the merged decision graph increases with each step, with a final accuracy growth of up to 16%.

Keywords
decision algebra, decision function, decision graph, decision tree, classification
National Category
Computer Sciences
Research subject
Computer and Information Sciences Computer Science, Computer Science
Identifiers
urn:nbn:se:lnu:diva-38560 (URN) 000344576300003 () 2-s2.0-84908464423 (Scopus ID)
Available from: 2014-12-09 Created: 2014-12-09 Last updated: 2018-05-17. Bibliographically approved
Danylenko, A. & Löwe, W. (2014). Merging Classifiers of Different Classification Approaches. In: 2014 IEEE International Conference on Data Mining Workshop (ICDMW): . Paper presented at IEEE International Conference on Data Mining Workshops, (ICDM) Workshops, Shenzhen, China, December 14, 2014 (pp. 706-715). IEEE Press
Merging Classifiers of Different Classification Approaches
2014 (English) In: 2014 IEEE International Conference on Data Mining Workshop (ICDMW), IEEE Press, 2014, p. 706-715. Conference paper, Published paper (Refereed)
Abstract [en]

Classification approaches, e.g., decision trees or Naive Bayesian classifiers, are often tightly coupled to learning strategies, special data structures, the type of information captured, and to how common problems, e.g., overfitting, are addressed. This prevents a simple combination of classifiers of different classification approaches learned over different data sets. Many different methods of combining classification models have been proposed. However, most of them combine the actual classification results rather than producing a new, possibly more accurate, classifier capturing the combined classification information. In this paper we propose a new general approach to combining different classification models based on the concept of Decision Algebra, which provides a unified formalization of classification approaches as higher-order decision functions. It defines a general combining operation, referred to as the merge operation, abstracting from the implementation details of different classifiers. We show that the combination of a series of probably accurate decision functions (regardless of the actual implementation) is even more accurate. This can be exploited, e.g., for distributed learning and for efficient general online learning. We support our results by combining a series of decision graphs and Naive Bayesian classifiers learned from random samples of the data sets. The results show that the accuracy of the combined classifier increases with each step, with a total accuracy growth of up to 17%.
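A hedged sketch of the central idea: classifiers of different approaches become combinable once each is wrapped as a decision function over the same attribute space. The wrapper names, the toy table and rule classifiers, and the averaging merge rule are illustrative assumptions, not the paper's API:

```python
# Hedged sketch: wrap heterogeneous classifiers as decision functions
# (attribute tuple -> class-probability dict), then merge the whole series.

def from_table(table, default):
    """Wrap a lookup-table classifier as a decision function."""
    return lambda attrs: table.get(attrs, default)

def from_rule(predicate, if_true, if_false):
    """Wrap a rule-based classifier as a decision function."""
    return lambda attrs: if_true if predicate(attrs) else if_false

def merge_all(dfs):
    """Average the class distributions of a series of decision functions."""
    def merged(attrs):
        dists = [df(attrs) for df in dfs]
        classes = {c for d in dists for c in d}
        return {c: sum(d.get(c, 0.0) for d in dists) / len(dists)
                for c in classes}
    return merged

table_clf = from_table({("sunny",): {"play": 1.0}}, {"play": 0.5, "stay": 0.5})
rule_clf = from_rule(lambda a: a == ("rainy",), {"stay": 1.0}, {"play": 1.0})
combined = merge_all([table_clf, rule_clf])
print(combined(("rainy",)))  # {'play': 0.25, 'stay': 0.75}
```

Because the merge sees only the common interface, the same operation works whether the inputs are decision graphs, Naive Bayesian classifiers, or a mix of both.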

Place, publisher, year, edition, pages
IEEE Press, 2014
National Category
Computer Sciences
Identifiers
urn:nbn:se:lnu:diva-42471 (URN) 10.1109/ICDMW.2014.64 (DOI) 000389255100096 () 2-s2.0-84936871579 (Scopus ID) 978-1-4799-4275-6 (ISBN)
Conference
IEEE International Conference on Data Mining Workshops, (ICDM) Workshops, Shenzhen, China, December 14, 2014
Available from: 2015-04-15 Created: 2015-04-15 Last updated: 2018-02-16. Bibliographically approved
Danylenko, A. & Löwe, W. (2012). Adaptation of Legacy Codes to Context-Aware Composition Using Aspect-Oriented Programming. Paper presented at the 11th International Conference, SC 2012, Prague, Czech Republic, May 31 – June 1, 2012. Lecture Notes in Computer Science, 7306, 68-85
Adaptation of Legacy Codes to Context-Aware Composition Using Aspect-Oriented Programming
2012 (English) In: Lecture Notes in Computer Science, ISSN 0302-9743, E-ISSN 1611-3349, Vol. 7306, p. 68-85. Article in journal (Refereed) Published
Abstract [en]

The context-aware composition approach (CAC) has been shown to improve the performance of object-oriented applications on modern multi-core hardware by selecting between different (sequential and parallel) component variants in different (call and hardware) contexts. However, introducing CAC in legacy applications can be time-consuming and requires considerable effort to change and adapt the existing code. We observe that CAC concerns, like offline component variant profiling and runtime selection of the champion variant, can be separated from the legacy application code. We suggest separating and reusing these CAC concerns when introducing CAC to different legacy applications.

To automate this process, we propose an approach based on Aspect-Oriented Programming (AOP) and Reflective Programming. Manual adaptation to CAC requires more programming than the AOP-based approach, almost three times as much in our experiments. Moreover, the AOP-based approach speeds up the execution of the legacy code, in our experiments by factors of up to 2.3 and 3.4 on multi-core machines with two and eight cores, respectively. The AOP-based approach introduces only a small runtime overhead compared to the manually optimized CAC approach; for different problems, this overhead is about 2-9% of the manual adaptation approach. These results suggest that AOP-based adaptation can effectively adapt legacy applications to CAC, making them run efficiently even on multi-core machines.
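The paper works with AspectJ-style aspects; as a hedged stand-in, a Python higher-order wrapper can play the aspect's role of intercepting a call, profiling the component variants, and then dispatching to the champion variant for the current context. All names, the context function, and the inline (rather than offline) profiling are my illustrative simplifications:

```python
# Hedged sketch of CAC dispatch: on the first call in a given context, time
# every variant (profiling pass); afterwards, always call the champion.

import time

def context_aware(variants, context_of):
    best = {}  # context -> champion variant, filled by profiling
    def dispatch(*args):
        ctx = context_of(*args)
        if ctx not in best:
            timings = {}
            for v in variants:           # profiling pass over all variants
                start = time.perf_counter()
                result = v(*args)        # all variants compute the same value
                timings[v] = time.perf_counter() - start
            best[ctx] = min(timings, key=timings.get)
            return result
        return best[ctx](*args)          # steady state: champion only
    return dispatch

def sum_loop(xs):      # "sequential" variant
    total = 0
    for x in xs:
        total += x
    return total

def sum_builtin(xs):   # alternative variant
    return sum(xs)

adaptive_sum = context_aware([sum_loop, sum_builtin], lambda xs: len(xs) > 1000)
print(adaptive_sum(list(range(10))))  # 45
```

In the AOP version, the interception point is a pointcut woven into the legacy code, which is why the legacy application itself needs almost no manual changes.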

Place, publisher, year, edition, pages
Springer-Verlag Berlin Heidelberg, 2012
Keywords
Context-Aware Composition, Autotuning, Aspect-Oriented Programming
National Category
Computer Sciences
Research subject
Computer Science, Software Technology
Identifiers
urn:nbn:se:lnu:diva-19232 (URN) 10.1007/978-3-642-30564-1 (DOI) 978-3-642-30563-4 (ISBN)
Conference
Proceedings 11th International Conference, SC 2012, Prague, Czech Republic, May 31 – June 1, 2012.
Projects
Context-Aware Composition of Parallel Components
Available from: 2012-08-30 Created: 2012-06-01 Last updated: 2018-01-12. Bibliographically approved
Danylenko, A. & Löwe, W. (2012). Context-Aware Recommender Systems for Non-functional Requirements. In: Third International Workshop on Recommendation Systems for Software Engineering (RSSE 2012). Paper presented at The Third International Workshop on Recommendation Systems for Software Engineering, 4 June, 2012, Zurich (pp. 80-84).
Context-Aware Recommender Systems for Non-functional Requirements
2012 (English) In: Third International Workshop on Recommendation Systems for Software Engineering (RSSE 2012), 2012, p. 80-84. Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

For large software projects, system designers have to adhere to a significant number of functional and non-functional requirements, which makes software development a complex engineering task. If these requirements change during the development process, complexity increases even further. In this paper, we suggest recommendation systems based on context-aware composition that enable a system designer to postpone and automate decisions regarding efficiency non-functional requirements, such as performance, and to focus instead on the design of the core functionality of the system.

Context-aware composition suggests the optimal component variants of a system for different static contexts (e.g., software and hardware environment) or even different dynamic contexts (e.g., actual parameters and resource utilization). Thus, an efficiency non-functional requirement can be automatically optimized, statically or dynamically, by providing possible component variants. Such a recommender system reduces the time and effort spent on manually developing optimal applications that adapt to different (static or dynamic) contexts and even to changes thereof.

Keywords
context-aware recommender systems; nonfunctional requirements; context-aware composition
National Category
Computer Sciences
Identifiers
urn:nbn:se:lnu:diva-19235 (URN) 10.1109/RSSE.2012.6233417 (DOI) 2-s2.0-84864706752 (Scopus ID)
Conference
The Third International Workshop on Recommendation Systems for Software Engineering, 4 June, 2012, Zurich
Available from: 2012-08-30 Created: 2012-06-01 Last updated: 2018-01-12. Bibliographically approved
Danylenko, A., Zimmermann, W. & Löwe, W. (2012). Decision Algebra: Parameterized Specification of Decision Models. In: Narsio Martí-Oliet, Miguel Palomino (Ed.), WADT 2012: 21st International Workshop on Algebraic Development Techniques, 7-10 June, 2012, Salamanca, Spain ; Technical report TR-08/12. Paper presented at 21st Workshop on Algebraic Development Techniques, 7-10 June, 2012, Salamanca, Spain (pp. 40-43).
Decision Algebra: Parameterized Specification of Decision Models
2012 (English) In: WADT 2012: 21st International Workshop on Algebraic Development Techniques, 7-10 June, 2012, Salamanca, Spain ; Technical report TR-08/12 / [ed] Narsio Martí-Oliet, Miguel Palomino, 2012, p. 40-43. Conference paper, Oral presentation with published abstract (Refereed)
National Category
Computer Sciences
Identifiers
urn:nbn:se:lnu:diva-19236 (URN)
Conference
21st Workshop on Algebraic Development Techniques, 7-10 June, 2012, Salamanca, Spain
Available from: 2012-08-30 Created: 2012-06-01 Last updated: 2018-01-12. Bibliographically approved
Danylenko, A., Kessler, C. & Löwe, W. (2011). Comparing Machine Learning Approaches for Context-Aware Composition. In: Sven Apel, Ethan Jackson (Ed.), Software Composition: 10th International Conference, SC 2011, Zurich, Switzerland, June 30 - July 1, 2011, Proceedings. Paper presented at International Conference on Software Composition 2012 (pp. 18-33). Berlin: Springer, 6708
Comparing Machine Learning Approaches for Context-Aware Composition
2011 (English) In: Software Composition: 10th International Conference, SC 2011, Zurich, Switzerland, June 30 - July 1, 2011, Proceedings / [ed] Sven Apel, Ethan Jackson, Berlin: Springer, 2011, Vol. 6708, p. 18-33. Chapter in book (Refereed)
Abstract [en]

Context-Aware Composition allows optimal variants of algorithms, data structures, and schedules to be selected automatically at runtime using generalized dynamic Dispatch Tables. These tables grow exponentially with the number of significant context attributes. To make Context-Aware Composition scale, we suggest four alternative implementations of Dispatch Tables, all well known in the field of machine learning: Decision Tree, Decision Diagram, Naive Bayes, and Support Vector Machine classifiers. We assess their decision overhead and memory consumption theoretically and practically in a number of experiments on different hardware platforms. Decision Diagrams turn out to be more compact than Dispatch Tables, almost as accurate, and faster in decision making. Using Decision Diagrams in Context-Aware Composition leads to better scalability, i.e., Context-Aware Composition can be applied at more program points and can regard more context attributes than before.
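A small sketch of why dispatch tables limit scalability: over k context attributes with v values each, a full table has v**k entries, one per context. The enumeration below is my illustration of that growth, not the paper's implementation; the variant-choice rule is arbitrary:

```python
# Hedged sketch: a full dispatch table enumerates every context, so its size
# is the product of the attribute domains — exponential in attribute count.

from itertools import product

def dispatch_table(attrs_values, choose):
    """Map every context tuple to a chosen variant (exhaustive, v**k entries)."""
    return {ctx: choose(ctx) for ctx in product(*attrs_values)}

# 3 binary context attributes -> 2**3 = 8 entries:
table = dispatch_table(
    [(0, 1)] * 3,
    lambda ctx: "parallel" if sum(ctx) >= 2 else "sequential",
)
print(len(table))  # 8
```

A decision-diagram representation of the same mapping can collapse contexts that lead to the same variant, which is the compactness advantage the abstract reports.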

Place, publisher, year, edition, pages
Berlin: Springer, 2011
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; Volume 6708
Keywords
Context-Aware Composition, Autotuning, Machine Learning
National Category
Computer Systems
Research subject
Computer and Information Sciences Computer Science, Computer Science
Identifiers
urn:nbn:se:lnu:diva-13442 (URN) 10.1007/978-3-642-22045-6_2 (DOI) 2-s2.0-79960128332 (Scopus ID) 978-3-642-22044-9 (ISBN) 978-3-642-22045-6 (ISBN)
Conference
International Conference on Software Composition 2012
Available from: 2011-07-05 Created: 2011-07-05 Last updated: 2017-01-27. Bibliographically approved
Danylenko, A. (2011). Decisions: Algebra and Implementation. (Licentiate dissertation). Växjö: School of Computer Science, Physics and Mathematics, Linnaeus University
Decisions: Algebra and Implementation
2011 (English) Licentiate thesis, monograph (Other academic)
Abstract [en]

Processing decision information is a constitutive part of a number of applications in Computer Science fields. In general, decision information can be used to deduce the relationship between a certain context and a certain decision, and it is represented by a decision model that captures this information. Frequently used examples of decision models are decision tables and decision trees. The choice of an appropriate decision model has an impact on application performance in terms of memory consumption and execution time: high memory consumption can occur due to redundancy in a decision model, and high execution time is often a consequence of an unsuitable decision model.

Applications in different domains try to overcome these problems by introducing new data structures or algorithms for implementing decision models. These solutions are usually domain-specific and hard to transfer from one domain to another. Different application domains of Computer Science often process decision information in a similar way and, hence, face similar problems. We should thus be able to present a unifying approach that is applicable in all application domains for capturing and manipulating decision information. Therefore, the goal of this thesis is (i) to suggest a general structure (Decision Algebra) which provides a common theoretical framework that captures decision information and defines operations (signatures) for storing, accessing, merging, approximating, and manipulating such information, along with some general algebraic laws that hold regardless of the implementation used. Our Decision Algebra allows defining different construction strategies for decision models and data structures that capture decision information as implementation variants, and it simplifies experimental comparisons between them.

Additionally, this thesis presents (ii) an implementation of Decision Algebra that captures the information in a non-redundant way and performs the operations efficiently. In fact, we show that existing decision models originating in the fields of Data Mining and Machine Learning, and variants thereof as exploited in special algorithms, can be understood as alternative implementation variants of Decision Algebra obtained by varying the implementations of its operations. Hence, this work (iii) contributes to a classification of existing technology for processing decision information in different application domains of Computer Science.

Place, publisher, year, edition, pages
Växjö: School of Computer Science, Physics and Mathematics, Linnaeus University, 2011. p. 131
Keywords
Classification, decision algebra, decision function, decision information, learning
National Category
Computer Sciences
Research subject
Computer and Information Sciences Computer Science, Computer Science
Identifiers
urn:nbn:se:lnu:diva-16283 (URN)
Presentation
2011-08-22, 15:39 (English)
Note

A thesis for the Degree of Licentiate of Philosophy in Computer Science.

Available from: 2011-12-22 Created: 2011-12-21 Last updated: 2018-01-12. Bibliographically approved
Danylenko, A., Lundberg, J. & Löwe, W. (2011). Decisions: Algebra and Implementation. In: Petra Perner (Ed.), Machine Learning and Data Mining in Pattern Recognition: 7th International Conference on Machine Learning and Data Mining in Pattern Recognition, MLDM 2011, New York, NY, USA, August/September 2011, Proceedings. Paper presented at 7th International Conference on Machine Learning and Data Mining (MLDM 2011) (pp. 31-45). Berlin, Heidelberg: Springer, 6871
Decisions: Algebra and Implementation
2011 (English) In: Machine Learning and Data Mining in Pattern Recognition: 7th International Conference on Machine Learning and Data Mining in Pattern Recognition, MLDM 2011, New York, NY, USA, August/September 2011, Proceedings / [ed] Perner, Petra, Berlin, Heidelberg: Springer, 2011, Vol. 6871, p. 31-45. Chapter in book (Refereed)
Abstract [en]

This paper presents a generalized theory for capturing and manipulating classification information. We define Decision Algebra, which models decision-based classifiers as higher-order decision functions abstracting from implementations using decision trees (or similar), decision rules, and decision tables. As a proof of the Decision Algebra concept, we compare decision trees with decision graphs, yet another instantiation of the proposed theoretical framework, which implement the Decision Algebra operations efficiently and capture classification information in a non-redundant way. Compared to classical decision tree implementations, decision graphs gain learning and classification speed-ups of up to 20% without accuracy loss and reduce memory consumption by 44%. This is confirmed by experiments.

Place, publisher, year, edition, pages
Berlin, Heidelberg: Springer, 2011
Series
Lecture Notes in Computer Science, ISSN 0302-9743 ; Volume 6871
National Category
Computer Sciences
Research subject
Computer and Information Sciences Computer Science, Computer Science
Identifiers
urn:nbn:se:lnu:diva-16287 (URN) 10.1007/978-3-642-23199-5_3 (DOI) 2-s2.0-80052323693 (Scopus ID) 978-3-642-23198-8 (ISBN) 978-3-642-23199-5 (ISBN)
Conference
7th International Conference on Machine Learning and Data Mining (MLDM 2011)
Available from: 2011-12-21 Created: 2011-12-21 Last updated: 2018-05-17. Bibliographically approved
Khairova, A., Löwe, W. & Lundberg, J. (2010). Decision algebras. Capturing and manipulating decision information: Doctoral Forum poster.
Decision algebras. Capturing and manipulating decision information: Doctoral Forum poster
2010 (English) Other (Other academic)
National Category
Mathematics
Identifiers
urn:nbn:se:lnu:diva-7189 (URN)
Available from: 2010-08-18 Created: 2010-08-13 Last updated: 2018-05-17. Bibliographically approved