Beyond Local Minima: Enhancing Deep Models Through Bezier Curve Interpolation


Yildiz A. F., AMASYALI M. F.

2025 Innovations in Intelligent Systems and Applications Conference (ASYU 2025), Bursa, Türkiye, 10-12 September 2025 (Full Text Paper)

  • Publication Type: Conference Paper / Full Text Paper
  • DOI: 10.1109/asyu67174.2025.11208476
  • City: Bursa
  • Country: Türkiye
  • Keywords: bezier curve, local solution refinement, loss surface, mode connectivity, neural network optimization, nonlinear interpolation
  • Affiliated with Yıldız Technical University: Yes

Abstract

Deep neural networks are typically trained using gradient-based optimization algorithms, with the goal of reaching a minimum in the parameter space. However, the resulting solutions often converge to local, rather than global, minima. Recent studies in the literature have shown that non-linear interpolation paths between independently trained models in the same parameter space can preserve test performance. This study goes a step further and demonstrates that it is possible to find even better-performing points (models) along these interpolation curves. Experiments conducted using various architectures and datasets show that Bezier curve interpolation can yield solutions with lower loss, especially near the endpoints of the curve, outperforming the original trained models at those points. Furthermore, it is shown that iterative application of the proposed approach can lead to progressively better models. These findings suggest that geometric search strategies defined by parametric curves may enhance traditional gradient-based methods used in neural network optimization, highlighting a promising and exciting direction for future research.
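The core idea lends itself to a short sketch. The following is a minimal, hypothetical PyTorch illustration (not the authors' code): two independently trained models are connected by a quadratic Bézier curve in parameter space, the curve is sampled at a grid of t values, and the sampled point with the lowest validation loss is kept. The control point is simply assumed here to be the midpoint of the two endpoint parameter vectors; helper names such as `bezier_point` and `search_curve` are illustrative.

```python
# Illustrative sketch only: quadratic Bezier interpolation in weight space,
# with the control point assumed to be the midpoint of the two endpoints.
import torch
from torch.nn.utils import parameters_to_vector, vector_to_parameters


def bezier_point(theta_a, theta_c, theta_b, t):
    """Quadratic Bezier curve: (1-t)^2 * A + 2t(1-t) * C + t^2 * B."""
    return (1 - t) ** 2 * theta_a + 2 * t * (1 - t) * theta_c + t ** 2 * theta_b


@torch.no_grad()
def evaluate(model, loader, loss_fn):
    """Average loss of `model` over a validation loader."""
    model.eval()
    total, n = 0.0, 0
    for x, y in loader:
        total += loss_fn(model(x), y).item() * len(x)
        n += len(x)
    return total / n


@torch.no_grad()
def search_curve(model_a, model_b, template, loader, loss_fn, steps=21):
    """Sample the Bezier curve between two trained models and keep the best point."""
    theta_a = parameters_to_vector(model_a.parameters())
    theta_b = parameters_to_vector(model_b.parameters())
    theta_c = 0.5 * (theta_a + theta_b)  # assumed control point (midpoint)

    best_loss, best_theta = float("inf"), None
    for t in torch.linspace(0.0, 1.0, steps):
        # Load the interpolated parameters into the template model and score it.
        vector_to_parameters(bezier_point(theta_a, theta_c, theta_b, t.item()),
                             template.parameters())
        loss = evaluate(template, loader, loss_fn)
        if loss < best_loss:
            best_loss = loss
            best_theta = parameters_to_vector(template.parameters()).clone()

    # Leave the template holding the best parameters found along the curve.
    vector_to_parameters(best_theta, template.parameters())
    return template, best_loss
```

In practice `template` can be created with `copy.deepcopy(model_a)` so the original endpoints remain untouched, and the returned model can replace one of the endpoints before the search is repeated, mirroring the iterative refinement described in the abstract.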