To reduce the learning error of the AdaBoost family of algorithms for multi-label classification, the AdaBoost algorithm is improved through two strategies. The first modifies the sample-distribution adjustment rule, abandoning the uniform sample distribution maintained in the existing AdaBoost algorithm, so that adding each weak classifier is guaranteed to reduce the estimated bound on the learning error. The second, unlike the existing AdaBoost algorithm, accounts for the effect of subsequent weak classifiers on the learning error when training the current weak classifier. In theory, each weak classifier added by the improved multi-label AdaBoost algorithms reduces the learning error by a larger amount. Theoretical analysis and experimental results show that all the improved algorithms are effective.
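
For reference, the following is a minimal sketch of a standard AdaBoost.MH-style training loop for multi-label classification, showing the baseline sample-distribution update D_{t+1} that the first strategy modifies; the improved reweighting rule itself is not reproduced here. The reduction to (sample, label) pairs, the use of decision stumps, and all function and variable names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def adaboost_mh(X, Y, n_rounds=10):
    """Baseline AdaBoost.MH sketch. X: (n, d) features; Y: (n, k) labels in {-1, +1}."""
    n, k = Y.shape
    # Standard AdaBoost.MH reduction: one binary example per (sample, label) pair,
    # with the label index appended as an extra feature.
    X_pairs = np.hstack([np.repeat(X, k, axis=0),
                         np.tile(np.arange(k), n).reshape(-1, 1)])
    y_pairs = Y.reshape(-1)

    D = np.full(n * k, 1.0 / (n * k))   # uniform initial distribution over pairs
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X_pairs, y_pairs, sample_weight=D)   # weak learner trained on D
        h = stump.predict(X_pairs)
        err = np.clip(np.sum(D[h != y_pairs]), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        # Baseline update: exponential reweighting followed by renormalization,
        # so D remains a probability distribution at every round.
        D *= np.exp(-alpha * y_pairs * h)
        D /= D.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas
```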