As an ensemble method, Mixture of Experts is used to obtain higher prediction performance in classification tasks. In this technique, a dataset is partitioned into regions by soft clustering, and an expert is assigned to each region and trained on the samples of that region. In this study, the dataset is instead divided into regions by a hard clustering method, and a decision tree classifier is constructed for each region as an expert. Each expert produces its own decision for a test point. The class prediction for the test point is made by the proposed gate function, which aggregates the decisions of all experts. When computing the final decision via the gate function, each expert receives a different weight according to its distance to the test point. In experiments on a group of benchmark datasets, the new gate function achieved better performance, and the proposed method outperforms base classifiers such as decision trees and k-Nearest Neighbors.
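The overall scheme described above can be sketched as follows. This is a minimal illustration, not the paper's exact method: the cluster count, the use of k-means as the hard clustering step, and the inverse-distance gate weighting are all assumptions made for the sake of a runnable example, and the class name `HardMoE` is hypothetical.

```python
# Hedged sketch of a hard-clustering Mixture of Experts:
# k-means partitions the data, one decision tree is trained per region,
# and a distance-weighted gate aggregates the experts' predictions.
# The inverse-distance weighting below is an illustrative assumption.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

class HardMoE:
    def __init__(self, n_experts=3, random_state=0):
        self.clusterer = KMeans(n_clusters=n_experts, n_init=10,
                                random_state=random_state)
        self.experts = []

    def fit(self, X, y):
        # Hard partition: each sample belongs to exactly one region.
        labels = self.clusterer.fit_predict(X)
        self.classes_ = np.unique(y)
        self.experts = []
        for k in range(self.clusterer.n_clusters):
            mask = labels == k
            tree = DecisionTreeClassifier(random_state=0)
            tree.fit(X[mask], y[mask])  # one expert per region
            self.experts.append(tree)
        return self

    def predict(self, X):
        # Gate: weight each expert by the inverse distance from the
        # test point to that expert's cluster center, then take the
        # class with the largest total weighted vote.
        d = self.clusterer.transform(X)          # (n_samples, n_experts)
        w = 1.0 / (d + 1e-9)
        w /= w.sum(axis=1, keepdims=True)
        votes = np.zeros((X.shape[0], len(self.classes_)))
        for k, tree in enumerate(self.experts):
            pred = tree.predict(X)
            for ci, c in enumerate(self.classes_):
                votes[:, ci] += w[:, k] * (pred == c)
        return self.classes_[votes.argmax(axis=1)]
```

Because the gate blends all experts rather than picking only the nearest one, a test point near a region boundary still receives input from the neighboring experts, with their influence decaying as distance grows.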