Classification in mixture of experts using hard clustering and a new gate function



Bulut F., Amasyalı M. F.

JOURNAL OF THE FACULTY OF ENGINEERING AND ARCHITECTURE OF GAZI UNIVERSITY, vol.31, pp.1017-1025, 2016 (Journal Indexed in SCI)

  • Publication Type: Article
  • Volume: 31
  • Publication Date: 2016
  • DOI Number: 10.17341/gazimmfd.278457
  • Title of Journal: JOURNAL OF THE FACULTY OF ENGINEERING AND ARCHITECTURE OF GAZI UNIVERSITY
  • Page Numbers: pp.1017-1025

Abstract

As one of the ensemble methods, Mixture of Experts is used to gain higher prediction performance on classification tasks. In this technique, a dataset is divided into regions by soft clustering. An expert is assigned to each region and trained with the samples of the corresponding region. In this study, a dataset is divided into regions by a hard clustering method, and a decision tree classifier is constructed for each region as an expert. Each expert produces its own decision for a test point. The class prediction for the test point is made by the proposed gate function, which aggregates all the experts' decisions. When the gate function computes the final decision, each expert receives a different weight according to its distance to the test point. In experiments on a group of benchmark datasets, better performance has been obtained with the new gate function. This new method gives better results than base classifiers such as decision trees and k-Nearest Neighbors.
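
To make the idea concrete, below is a minimal sketch of the pipeline described in the abstract, assuming k-means as the hard clustering step and inverse-distance-to-centroid weighting as the gate. The abstract does not specify the exact clustering algorithm or the form of the proposed gate function, so those choices (and the class name) are illustrative rather than the authors' method.

```python
# Sketch: hard-clustered mixture of decision-tree experts with a
# distance-weighted gate (assumed form; not the paper's exact gate).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier


class HardClusteredMixtureOfExperts:
    def __init__(self, n_experts=3, random_state=0):
        self.n_experts = n_experts
        self.random_state = random_state

    def fit(self, X, y):
        # Hard clustering: each training sample belongs to exactly one region.
        self.clusterer_ = KMeans(n_clusters=self.n_experts,
                                 random_state=self.random_state, n_init=10)
        labels = self.clusterer_.fit_predict(X)
        # One decision-tree expert per region, trained only on that region's samples.
        self.experts_ = []
        for k in range(self.n_experts):
            tree = DecisionTreeClassifier(random_state=self.random_state)
            tree.fit(X[labels == k], y[labels == k])
            self.experts_.append(tree)
        self.classes_ = np.unique(y)
        return self

    def predict(self, X):
        # Gate: weight each expert by its closeness to the test point,
        # here taken as inverse distance to the expert's cluster centroid.
        dists = self.clusterer_.transform(X)        # (n_samples, n_experts)
        weights = 1.0 / (dists + 1e-12)
        votes = np.zeros((X.shape[0], len(self.classes_)))
        for k, tree in enumerate(self.experts_):
            pred = tree.predict(X)
            for c_idx, c in enumerate(self.classes_):
                votes[:, c_idx] += weights[:, k] * (pred == c)
        # Final decision: class with the largest weighted vote.
        return self.classes_[np.argmax(votes, axis=1)]
```

In this sketch every expert votes on every test point, but experts whose regions lie far from the point contribute little, which mirrors the distance-dependent weighting described above.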