v19.13 New Feature

Adam Optimizer for Stochastic Gradient Descent Is Now the Default in the stochasticLinearRegression() and stochasticLogisticRegression() Aggregate Functions

Adam optimizer for stochastic gradient descent is now used by default in the stochasticLinearRegression() and stochasticLogisticRegression() aggregate functions, because it gives good quality with almost no tuning. #6000 (Quid37)

Why it matters

The change improves the quality of results from these stochastic gradient descent functions: Adam adapts the step size per parameter, which gives good convergence without requiring significant hyperparameter tuning from users.

How to use it

Users get the Adam optimizer automatically: it is the default method in the stochasticLinearRegression() and stochasticLogisticRegression() aggregate functions, so no additional configuration is required.
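
A minimal usage sketch, assuming a hypothetical train_data table with a numeric target column and feature columns f1 and f2. The function's parameters are learning rate, L2 regularization coefficient, mini-batch size, and optimization method; the values shown are illustrative:

```sql
-- Train a model; Adam is used because no method is specified.
SELECT stochasticLinearRegressionState(0.01, 0.1, 15)(target, f1, f2) AS state
FROM train_data;

-- To opt out of the new default, name the method explicitly.
SELECT stochasticLinearRegressionState(0.01, 0.1, 15, 'SGD')(target, f1, f2) AS state
FROM train_data;
```

The saved aggregation state can later be applied to new rows with the evalMLMethod() function to produce predictions.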