AN ENTROPY BASED ANALYSIS OF CLASS DISTRIBUTIONS USING TSALLIS ENTROPY


Oruç Ünal N., Yıldız D., Göztaş M.

2nd INTERNATIONAL CONGRESS ON NATURAL SCIENCES AND APPLIED MATHEMATICS, İstanbul, Türkiye, 7 - 8 February 2026, pp. 41, (Abstract)

  • Publication Type: Conference Paper / Abstract
  • DOI: 10.5281/zenodo.18640991
  • City of Publication: İstanbul
  • Country of Publication: Türkiye
  • Page Numbers: pp. 41
  • Yıldız Technical University Affiliated: Yes

Abstract

Shannon entropy is widely regarded as the most common and established measure of uncertainty in the literature. However, the assumption that it is the best measure of uncertainty for every data structure does not always hold. Especially when class distributions are imbalanced, skewed, or overlapping, evaluations based on a single entropy measure may be limited. In this context, the main research question of this study is whether Tsallis entropy, which has an adjustable parameter q, can reflect the structural characteristics of class distributions more accurately than Shannon entropy. A comparative analysis was performed using the Palmerpenguins dataset. To obtain probability distributions from continuous variables without parametric assumptions, an equal-frequency discretization approach was adopted. Conditional class probabilities were then calculated, and Shannon entropy and Tsallis entropy were computed from the resulting distributions for different values of the parameter q. To evaluate variable-based uncertainty levels comprehensively, the entropy values were combined using weighted averages. The findings reveal that Shannon entropy exhibits a relatively rigid and limited sensitivity to class structure, whereas Tsallis entropy is highly sensitive to features such as dominance, imbalance, and heterogeneity in class distributions. In particular, where classes are distinctly separated, Tsallis entropy values decrease sharply as q increases. Conversely, in more complex structures with significant overlap between classes, Tsallis entropy was found to more effectively capture uncertainty components that Shannon entropy could not adequately distinguish.
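The pipeline described above (equal-frequency discretization, conditional class probabilities per bin, and bin-weighted Shannon and Tsallis entropies) can be sketched as follows. This is a minimal illustration, not the authors' code: the synthetic two-species data stands in for the Palmerpenguins dataset, and the feature name, bin count, and q value are assumptions chosen for demonstration.

```python
import numpy as np
import pandas as pd

def shannon_entropy(p):
    """Shannon entropy H = -sum(p * ln p), ignoring zero-probability terms."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum(p^q)) / (q - 1); reduces to Shannon as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

# Hypothetical stand-in for one continuous feature with two species labels
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "flipper_len": np.concatenate([rng.normal(190, 6, 150), rng.normal(217, 6, 150)]),
    "species": ["Adelie"] * 150 + ["Gentoo"] * 150,
})

# Equal-frequency discretization into 5 bins (no parametric assumptions)
df["bin"] = pd.qcut(df["flipper_len"], q=5, labels=False, duplicates="drop")

# Conditional class probabilities P(species | bin), combined by bin-weighted averaging
weighted_H, weighted_S2 = 0.0, 0.0
for _, grp in df.groupby("bin"):
    p = grp["species"].value_counts(normalize=True).to_numpy()
    w = len(grp) / len(df)  # bin weight = relative frequency of the bin
    weighted_H += w * shannon_entropy(p)
    weighted_S2 += w * tsallis_entropy(p, q=2.0)

print(f"weighted Shannon entropy:       {weighted_H:.4f}")
print(f"weighted Tsallis entropy (q=2): {weighted_S2:.4f}")
```

For well-separated classes, most bins are dominated by one species, and raising q drives the Tsallis value down faster than the Shannon value, which matches the sensitivity contrast reported in the abstract.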
In conclusion, thanks to its adjustable parameter, Tsallis entropy offers a more flexible and explanatory measure of uncertainty, especially for datasets with imbalanced and complex class distributions. This property makes Tsallis entropy a more sensitive and generalizable alternative to traditional entropy measures.
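The "adjustable" character of Tsallis entropy can be made precise with its standard definition, which recovers Shannon entropy in the limit of the parameter; this is a textbook identity, stated here for completeness rather than taken from the study itself:

```latex
S_q(p) = \frac{1}{q-1}\left(1 - \sum_{i} p_i^{\,q}\right), \qquad q \in \mathbb{R},\ q \neq 1.
% Writing p_i^q = p_i\, e^{(q-1)\ln p_i} and expanding to first order in (q-1):
\lim_{q \to 1} S_q(p)
  = \lim_{q \to 1} \frac{1 - \sum_i p_i\, e^{(q-1)\ln p_i}}{q-1}
  = -\sum_i p_i \ln p_i = H(p).
```

Thus q tunes how strongly dominant classes are weighted: for q > 1, terms with large p_i dominate the sum and the entropy drops sharply under class dominance, consistent with the behavior reported above.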