Merging Classifiers of Different Classification Approaches
Linnaeus University, Faculty of Technology, Department of Computer Science.
Linnaeus University, Faculty of Technology, Department of Computer Science (Software Technology Labs). ORCID iD: 0000-0002-7565-3714
2014 (English). In: 2014 IEEE International Conference on Data Mining Workshop (ICDMW), IEEE Press, 2014, p. 706-715. Conference paper, Published paper (Refereed)
Abstract [en]

Classification approaches, e.g., decision trees or Naive Bayesian classifiers, are often tightly coupled to learning strategies, special data structures, the type of information captured, and to how common problems, e.g., overfitting, are addressed. This prevents a simple combination of classifiers of different classification approaches learned over different data sets. Many different methods of combining classification models have been proposed. However, most of them are based on a combination of the actual results of classification rather than on producing a new, possibly more accurate, classifier capturing the combined classification information. In this paper we propose a new general approach to combining different classification models based on the concept of Decision Algebra, which provides a unified formalization of classification approaches as higher-order decision functions. It defines a general combining operation, referred to as the merge operation, abstracting from implementation details of different classifiers. We show that the combination of a series of probably accurate decision functions (regardless of the actual implementation) is even more accurate. This can be exploited, e.g., for distributed learning and for efficient general online learning. We support our results by combining a series of decision graphs and Naive Bayesian classifiers learned from random samples of the data sets. The result shows that in each step the accuracy of the combined classifier increases, with a total accuracy growth of up to 17%.
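The merge operation described above can be illustrated on a concrete classifier. The following is a minimal sketch, not the paper's Decision Algebra implementation: the class and method names are hypothetical, and it assumes a categorical Naive Bayesian classifier whose learned state is a set of counts, so two instances trained on different samples can be merged by summing those counts.

```python
import math
from collections import Counter, defaultdict

class MergeableNaiveBayes:
    """Categorical Naive Bayes whose learned state is a set of counts,
    so two independently trained models can be merged by summation."""

    def __init__(self):
        self.class_counts = Counter()               # N(c)
        self.feature_counts = defaultdict(Counter)  # N(c, i, v)

    def fit(self, X, y):
        for features, label in zip(X, y):
            self.class_counts[label] += 1
            for i, v in enumerate(features):
                self.feature_counts[(label, i)][v] += 1
        return self

    def merge(self, other):
        """Combine two classifiers trained on different samples
        by summing their sufficient statistics."""
        merged = MergeableNaiveBayes()
        merged.class_counts = self.class_counts + other.class_counts
        for key in set(self.feature_counts) | set(other.feature_counts):
            merged.feature_counts[key] = (
                self.feature_counts[key] + other.feature_counts[key]
            )
        return merged

    def predict(self, features):
        total = sum(self.class_counts.values())
        best, best_score = None, float("-inf")
        for c, n_c in self.class_counts.items():
            # log P(c) + sum_i log P(x_i | c), with simple add-one smoothing
            score = math.log(n_c / total)
            for i, v in enumerate(features):
                counts = self.feature_counts[(c, i)]
                score += math.log((counts[v] + 1) / (n_c + len(counts) + 1))
            if score > best_score:
                best, best_score = c, score
        return best
```

For this count-based classifier the merge is exact: a model merged from two halves of a data set behaves identically to a model trained on the whole data set, which mirrors the paper's claim that combining a series of probably accurate decision functions yields an even more accurate one.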

Place, publisher, year, edition, pages
IEEE Press, 2014. p. 706-715
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:lnu:diva-42471
DOI: 10.1109/ICDMW.2014.64
ISI: 000389255100096
Scopus ID: 2-s2.0-84936871579
ISBN: 978-1-4799-4275-6 (print)
OAI: oai:DiVA.org:lnu-42471
DiVA, id: diva2:805588
Conference
IEEE International Conference on Data Mining Workshops (ICDM Workshops), Shenzhen, China, December 14, 2014
Available from: 2015-04-15 Created: 2015-04-15 Last updated: 2018-02-16. Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text
Scopus

Authority records

Danylenko, Antonina; Löwe, Welf
