PREPRINT

This article is a preprint and has not been peer-reviewed.
Results reported in preprints should not be presented in the media as verified information.
Optimizing ML Training with Perturbed Equations
2025-12-12

Optimizing ML Training with Perturbed Equations explores how introducing controlled perturbations into optimization equations can improve the stability, generalization, and convergence of machine-learning models. Training methods based on perturbed differential equations help smooth the loss landscape, keep models from converging to sharp minima, and enhance robustness to noise. By integrating stochastic or deterministic perturbations into the gradient dynamics, these approaches enable more efficient training, faster convergence, and improved performance on complex tasks.
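The abstract does not specify the exact perturbation scheme, but one common way to realize "stochastic perturbations of the gradient dynamics" is gradient descent with annealed Gaussian noise injected into each update. The sketch below is a minimal illustration under that assumption; the function name `perturbed_gd` and all hyperparameters are hypothetical, not taken from the paper.

```python
import numpy as np

def perturbed_gd(grad, x0, lr=0.1, noise_scale=0.5, decay=0.99,
                 steps=200, seed=0):
    """Gradient descent with decaying Gaussian perturbations.

    Early, larger noise can help the iterate escape sharp regions of
    the loss landscape; the geometric decay lets it settle near a
    (flatter) minimum. This is an illustrative sketch, not the
    paper's method.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    scale = noise_scale
    for _ in range(steps):
        # Standard gradient step plus an isotropic Gaussian perturbation.
        x = x - lr * grad(x) + scale * rng.normal(size=x.shape)
        scale *= decay  # anneal the perturbation over time
    return x

# Toy example: quadratic loss f(x) = ||x||^2, so grad f(x) = 2x,
# with the minimum at the origin.
x_star = perturbed_gd(lambda x: 2.0 * x, x0=[3.0, -2.0])
```

Despite the injected noise, the decaying schedule lets the iterate approach the minimizer; with a constant noise scale it would instead fluctuate around it, which is the usual trade-off between exploration and convergence in such schemes.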

How to cite:

Eshbolotova S. 2025. Optimizing ML Training with Perturbed Equations. PREPRINTS.RU. https://doi.org/10.24108/preprints-3114061
