
New logarithmic step size for stochastic gradient descent

Credit: M. Soheil Shamaee, S. Fathi Hafshejani, Z. Saeidian

The step size, also known as the learning rate, plays a pivotal role in the efficiency of the stochastic gradient descent (SGD) algorithm. In recent years, a number of step size strategies have emerged for enhancing SGD performance. However, a significant challenge associated with these step sizes concerns their probability distribution, denoted as η_t / Σ_{t=1}^{T} η_t.

Ideally, this distribution should avoid assigning exceedingly small values to the final iterations. The widely used cosine step size, for instance, while effective in practice, runs into exactly this issue: it assigns very low probability values to the final iterations.
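To make the issue concrete, the following minimal sketch computes the normalized selection probabilities η_t / Σ_{t=1}^{T} η_t for the standard cosine annealing schedule η_t = (η_0/2)(1 + cos(πt/T)); the horizon T and base step size η_0 are illustrative values, not taken from the paper.

```python
import math

def cosine_step_size(t, T, eta0=0.1):
    # Standard cosine annealing: eta_t = eta0/2 * (1 + cos(pi * t / T)),
    # which decays toward zero as t approaches T.
    return 0.5 * eta0 * (1.0 + math.cos(math.pi * t / T))

T = 100  # illustrative training horizon
etas = [cosine_step_size(t, T) for t in range(1, T + 1)]
total = sum(etas)

# Probability that iterate t is "selected", i.e. p_t = eta_t / sum_t eta_t.
probs = [eta / total for eta in etas]

print(f"p_1   = {probs[0]:.4f}")   # early iterations carry sizeable weight
print(f"p_{T} = {probs[-1]:.2e}")  # the final iteration gets an exceedingly small weight
```

Because the cosine schedule vanishes at t = T, the last iterates receive almost no probability mass, which is exactly the concern raised above.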

To address this challenge, a research team led by M. Soheil Shamaee published their study in Frontiers of Computer Science.

The team introduces a new logarithmic step size for the SGD approach. This new step size proves particularly effective during the final iterations, where it enjoys a significantly higher probability of selection compared with the conventional cosine step size.
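The article does not give the exact functional form of the proposed step size, so the sketch below only shows how a logarithmic-style decay (an assumed form, roughly η_t = η_0 · (1 − ln(t+1)/ln(T+1))) could be plugged into an ordinary SGD training loop via PyTorch's LambdaLR scheduler; the model, horizon, and base learning rate are placeholders.

```python
import math
import torch

T = 100  # illustrative training horizon

def log_decay_factor(t, horizon=T):
    # Assumed logarithmic-style multiplier on the base learning rate:
    # equals 1 at t = 0 and decays like 1 - ln(t + 1) / ln(horizon + 1),
    # staying comparatively large in the final iterations.
    return max(1.0 - math.log(t + 1) / math.log(horizon + 1), 0.0)

model = torch.nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=log_decay_factor)

for t in range(T):
    optimizer.zero_grad()
    loss = model(torch.randn(32, 10)).pow(2).mean()  # dummy loss for illustration
    loss.backward()
    optimizer.step()
    scheduler.step()  # apply the decayed step size for the next iteration
```

Compared with the cosine schedule above, this kind of decay keeps the late step sizes, and hence their selection probabilities, noticeably larger near the end of training.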

Consequently, the new step size method outperforms the cosine step size method in these critical concluding iterations, benefiting from its increased probability of selection. The numerical results obtained serve as a testament to the efficiency of the newly proposed step size, particularly on the FashionMNIST, CIFAR10, and CIFAR100 datasets.

Moreover, the new logarithmic step size achieves remarkable improvements in test accuracy, including a 0.9% increase on the CIFAR100 dataset when used with a convolutional neural network (CNN) model.

More information:
New logarithmic step size for stochastic gradient descent, Frontiers of Computer Science (2024). DOI: 10.1007/s11704-023-3245-z. journal.hep.com.cn/fcs/EN/10.1 … 07/s11704-023-3245-z

Provided by
Higher Education Press

Citation:
New logarithmic step size for stochastic gradient descent (2024, April 22)
retrieved 26 April 2024
from https://techxplore.com/news/2024-04-logarithmic-size-stochastic-gradient-descent.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.


