Abstract:
Boosting has been extensively used in image processing. Much prior work focuses on the design or application of boosting, while training boosting models on large-scale datasets has largely been ignored. To handle the large-scale setting, we present stochastic boosting (StocBoost), which relies on stochastic gradient descent (SGD) and uses a single sample at each iteration. To understand the efficacy of StocBoost, we theoretically analyze the convergence of the training algorithm. Experimental results show that StocBoost is faster than batch methods and is comparable with the state of the art.
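The abstract does not spell out the update rule, so the following is only a rough sketch of the one-sample-per-iteration idea, not the paper's actual StocBoost algorithm. It learns the combination weights of a fixed pool of random decision stumps by single-sample SGD on the logistic loss; the stump pool, the loss, the learning rate, and all function names are assumptions made for illustration.

```python
# Sketch of single-sample stochastic boosting (illustrative assumption,
# not the paper's exact StocBoost algorithm): SGD over the combination
# weights of a fixed pool of random decision stumps, logistic loss.
import numpy as np

def make_stumps(n_stumps, n_features, rng):
    """Build a fixed pool of random decision stumps h(x) in {-1, +1}."""
    features = rng.integers(0, n_features, size=n_stumps)
    thresholds = rng.normal(size=n_stumps)
    def predict(X):
        # Column j holds the output of stump j on every row of X.
        return np.where(X[:, features] > thresholds, 1.0, -1.0)
    return predict

def stocboost_sgd(X, y, n_stumps=50, lr=0.1, n_iters=10000, seed=0):
    """Learn combination weights alpha over the stump pool with SGD,
    touching exactly one (x_i, y_i) per iteration (labels y in {-1, +1})."""
    rng = np.random.default_rng(seed)
    stumps = make_stumps(n_stumps, X.shape[1], rng)
    H = stumps(X)                    # precomputed stump outputs, shape (n, n_stumps)
    alpha = np.zeros(n_stumps)
    for _ in range(n_iters):
        i = rng.integers(len(y))     # draw one sample uniformly at random
        margin = y[i] * (H[i] @ alpha)
        # Gradient of the logistic loss log(1 + exp(-margin)) w.r.t. alpha.
        grad = -y[i] * H[i] / (1.0 + np.exp(margin))
        alpha -= lr * grad
    return lambda Xnew: np.sign(stumps(Xnew) @ alpha)

# Usage on a toy linearly separable problem.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = stocboost_sgd(X, y)
print("train accuracy:", (model(X) == y).mean())
```

Because each iteration touches a single example, the per-step cost is independent of the dataset size, which is the source of the speedup over batch boosting claimed in the abstract.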