v19.13 New Feature
Adam Optimizer Is Now the Default in stochasticLinearRegression() and stochasticLogisticRegression()
The Adam optimizer for stochastic gradient descent is now used by default in the stochasticLinearRegression() and stochasticLogisticRegression() aggregate functions, because it delivers good quality with almost no tuning. #6000 (Quid37)
Why it matters
The change improves the quality of results from these stochastic gradient descent functions: Adam adapts the step size per parameter, so it converges well without requiring significant parameter tuning from users.

How to use it
Users benefit from the Adam optimizer automatically: it is the default within the stochasticLinearRegression() and stochasticLogisticRegression() aggregate functions, so no additional configuration is required.
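As a sketch, a typical train-and-predict round trip with these functions looks like the following. The table and column names (your_model, train_data, test_data, target, param1, param2) are placeholders; the fourth parameter of stochasticLinearRegressionState selects the optimizer explicitly, and can now be omitted since 'Adam' is the default:

```sql
-- Train: aggregate the training rows into a model state.
-- Parameters: learning rate, L2 regularization, mini-batch size, optimizer.
-- The optimizer argument may be omitted now that 'Adam' is the default.
CREATE TABLE your_model ENGINE = Memory AS
SELECT
    stochasticLinearRegressionState(0.1, 0.0, 5, 'Adam')(target, param1, param2) AS state
FROM train_data;

-- Predict: apply the stored state to new rows with evalMLMethod.
WITH (SELECT state FROM your_model) AS model
SELECT evalMLMethod(model, param1, param2) AS prediction
FROM test_data;
```

The -State combinator stores the trained weights rather than a final value, which is what lets the model be saved in a table and reused for prediction later.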