AdaBoost is a machine learning technique that combines many weak classifiers into a single strong classifier to improve classification performance. Gentle AdaBoost is a variant of AdaBoost that introduces Newton steps into the boosting process. On low-noise data, Gentle AdaBoost has been shown to achieve better overall performance, in terms of both training error and generalization error, than other AdaBoost variants. However, it suffers from overfitting when the training data contain high levels of noise. To address this problem, we propose a new approach that limits weight distortion according to a stretched distribution of the full set of sample weights. Experimental results show that our algorithm achieves lower generalization error on both standard and noise-corrupted datasets. Moreover, our method does not increase computation time compared with Gentle AdaBoost.
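For context, the following is a minimal sketch of the standard Gentle AdaBoost loop (Friedman et al.'s formulation, using decision stumps as weighted least-squares weak regressors, which is the Newton-step interpretation the abstract refers to). It is not the paper's noise-robust variant: the function name, the number of rounds, and the clipping stand-in mentioned in the comments are illustrative assumptions, and the proposed weight-limiting rule itself is not reproduced here.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gentle_adaboost(X, y, n_rounds=50):
    """Standard Gentle AdaBoost sketch. Labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # uniform initial sample weights
    stumps = []
    for _ in range(n_rounds):
        # Newton step: weighted least-squares fit of the labels on X
        stump = DecisionTreeRegressor(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        f = stump.predict(X)
        # multiplicative weight update derived from the exponential loss;
        # misclassified samples gain weight, correct ones lose it
        w *= np.exp(-y * f)
        # The paper's contribution would intervene at this point, limiting
        # how far the weight distribution may be distorted. A simple
        # stand-in (NOT the authors' rule) would be something like:
        #     w = np.clip(w, None, c * w.mean())
        w /= w.sum()                     # renormalize to a distribution
        stumps.append(stump)
    return stumps

def predict(stumps, X):
    # strong classifier: sign of the summed weak-regressor outputs
    F = sum(s.predict(X) for s in stumps)
    return np.sign(F)

Under high label noise, the unbounded exponential update above concentrates weight on noisy samples across rounds, which is the overfitting behavior the proposed weight-limiting step is designed to suppress.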