Gradient-based optimization methods are a cornerstone of numerical optimization, providing powerful techniques for minimizing or maximizing objective functions in domains ranging from machine learning and deep learning to physics and engineering. These methods use the gradient of the objective function with respect to its parameters to update the parameters iteratively, stepping along the negative gradient to decrease the function's value (or along the gradient to increase it).
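The iterative update described above can be sketched as plain gradient descent. This is a minimal illustrative example, not taken from the source: it minimizes the toy objective f(x) = x², whose gradient is f'(x) = 2x, and the function name, learning rate, and step count are all hypothetical choices.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient to reduce the objective.

    grad  -- function returning the gradient of the objective at x
    x0    -- initial parameter value
    lr    -- learning rate (step size)
    steps -- number of update iterations
    """
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # update rule: x <- x - lr * f'(x)
    return x

# Minimize f(x) = x^2; its gradient is f'(x) = 2x, and the minimum is at x = 0.
x_min = gradient_descent(lambda x: 2 * x, x0=5.0)
print(round(x_min, 6))  # converges toward 0.0
```

The same loop generalizes to vectors of parameters by replacing the scalar arithmetic with elementwise operations; the choice of learning rate governs the trade-off between convergence speed and stability.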