Merging Classifiers of Different Classification Approaches
Linnaeus University, Faculty of Technology (FTK), Department of Computer Science (DV).
Linnaeus University, Faculty of Technology (FTK), Department of Computer Science (DV). (Software Technology Labs) ORCID iD: 0000-0002-7565-3714
2014 (English). In: 2014 IEEE International Conference on Data Mining Workshop (ICDMW), IEEE Press, 2014, pp. 706-715. Conference paper, published paper (refereed)
Abstract [en]

Classification approaches, e.g., decision trees or Naive Bayesian classifiers, are often tightly coupled to learning strategies, special data structures, the type of information captured, and to how common problems, e.g., overfitting, are addressed. This prevents a simple combination of classifiers of different classification approaches learned over different data sets. Many different methods of combining classification models have been proposed. However, most of them are based on a combination of the actual results of classification rather than producing a new, possibly more accurate, classifier capturing the combined classification information. In this paper we propose a new general approach to combining different classification models based on the concept of Decision Algebra, which provides a unified formalization of classification approaches as higher-order decision functions. It defines a general combining operation, referred to as the merge operation, abstracting from implementation details of different classifiers. We show that the combination of a series of probably accurate decision functions (regardless of the actual implementation) is even more accurate. This can be exploited, e.g., for distributed learning and for efficient general online learning. We support our results by combining a series of decision graphs and Naive Bayesian classifiers learned from random samples of the data sets. The results show that at each step the accuracy of the combined classifier increases, with a total accuracy growth of up to 17%.
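The core idea of viewing classifiers as higher-order decision functions that can be merged regardless of implementation can be sketched as follows. Note this is not the paper's Decision Algebra merge operation itself, only a simplified stand-in: each classifier (tree-like, Bayesian, ...) is wrapped as a function from a feature vector to a class-probability distribution, and `merge` (a hypothetical name) combines them by averaging the distributions they return.

```python
# Sketch: classifiers of different approaches wrapped as decision
# functions (feature vector -> class-probability distribution), merged
# into a new decision function that abstracts from their internals.
# Assumption: averaging distributions stands in for the paper's more
# general merge operation.
from typing import Callable, Dict, List

DecisionFunction = Callable[[List[float]], Dict[str, float]]

def merge(*fns: DecisionFunction) -> DecisionFunction:
    """Combine decision functions into one, hiding whether the underlying
    classifier is a decision graph, a Naive Bayesian model, etc."""
    def merged(x: List[float]) -> Dict[str, float]:
        totals: Dict[str, float] = {}
        for f in fns:
            for label, p in f(x).items():
                totals[label] = totals.get(label, 0.0) + p
        n = len(fns)
        return {label: p / n for label, p in totals.items()}
    return merged

# Two toy "classifiers" (hypothetical), already wrapped as decision functions:
tree_like = lambda x: {"pos": 0.9, "neg": 0.1} if x[0] > 0 else {"pos": 0.2, "neg": 0.8}
bayes_like = lambda x: {"pos": 0.7, "neg": 0.3} if x[0] > 0 else {"pos": 0.4, "neg": 0.6}

combined = merge(tree_like, bayes_like)
print(combined([1.0]))  # averaged class distribution for x[0] > 0
```

Because `merge` takes and returns decision functions, merged classifiers can themselves be merged again, which is what makes the step-wise combination over random samples (as in the experiments) expressible uniformly.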

Place, publisher, year, edition, pages
IEEE Press, 2014. pp. 706-715
National subject category
Computer Science
Identifiers
URN: urn:nbn:se:lnu:diva-42471; DOI: 10.1109/ICDMW.2014.64; ISI: 000389255100096; Scopus ID: 2-s2.0-84936871579; ISBN: 978-1-4799-4275-6 (print); OAI: oai:DiVA.org:lnu-42471; DiVA id: diva2:805588
Conference
IEEE International Conference on Data Mining Workshops (ICDM Workshops), Shenzhen, China, December 14, 2014
Available from: 2015-04-15; Created: 2015-04-15; Last updated: 2018-02-16; Bibliographically approved

Open Access in DiVA

Full text is not available in DiVA

Other links

Publisher's full text | Scopus

Persons

Danylenko, Antonina; Löwe, Welf

Search further in DiVA

By the author/editor
Danylenko, Antonina; Löwe, Welf
By the organisation
Department of Computer Science (DV)
Computer Science

Search outside of DiVA

Google | Google Scholar
