To overcome the limitation that the AdaBoost algorithm is only applicable to unstable learning algorithms, a method of adjusting sample centers according to sample weights is proposed, based on the idea that each newly added classifier should reduce the training error of the ensemble classifier. With this method, AdaBoost can be generalized into several new ensemble learning methods built on stable learning algorithms, such as dynamically adjusting the centers of sample attributes, classifying by a weighted distance measure, and dynamically combining sample attributes, which greatly expands the application scope of AdaBoost. Furthermore, whereas the combination coefficients and the sample-weight update strategy in AdaBoost are set to reduce the training error only indirectly, a directly goal-oriented ensemble learning algorithm is also proposed. Experimental analysis on UCI datasets shows that the generalized AdaBoost algorithms are effective and that some of them outperform the ordinary AdaBoost algorithm.
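To make the central idea concrete, the sketch below shows one plausible reading of "adjusting sample centers with sample weights": a nearest-centroid classifier (normally a stable learner that ignores reweighting) is made weight-sensitive by computing each class center as the sample-weight-weighted mean, so AdaBoost's reweighting produces diverse base classifiers. This is a minimal illustration, not the paper's actual algorithm; the names `weighted_centroid_learner` and `adaboost_with_centroids`, the number of rounds `T`, and all other details are assumptions.

```python
import numpy as np

def weighted_centroid_learner(X, y, w):
    """Hypothetical weight-sensitive base learner: a nearest-centroid
    classifier whose class centers are the sample-weight-weighted means,
    so that AdaBoost's reweighting shifts the centers each round."""
    classes = np.unique(y)
    centers = np.array([np.average(X[y == c], axis=0, weights=w[y == c])
                        for c in classes])

    def predict(Xq):
        # Assign each query point to the class with the nearest center.
        d = np.linalg.norm(Xq[:, None, :] - centers[None, :, :], axis=2)
        return classes[np.argmin(d, axis=1)]

    return predict

def adaboost_with_centroids(X, y, T=20):
    """Standard binary AdaBoost (labels in {-1, +1}) run over the
    weighted-centroid base learner above; illustrative sketch only."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial sample weights
    hypotheses, alphas = [], []
    for _ in range(T):
        h = weighted_centroid_learner(X, y, w)
        pred = h(X)
        err = np.sum(w[pred != y])    # weighted training error
        if err >= 0.5:                # no better than chance: stop early
            break
        err = max(err, 1e-12)         # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)  # usual AdaBoost reweighting rule
        w /= w.sum()
        hypotheses.append(h)
        alphas.append(alpha)

    def ensemble(Xq):
        scores = sum(a * h(Xq) for a, h in zip(alphas, hypotheses))
        return np.sign(scores)

    return ensemble
```

The key design point is in the base learner: because the class centers depend on the current weight distribution, a learner that would otherwise be stable now changes from round to round, which is the property AdaBoost needs to form a useful ensemble.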