The minimum squared error (MSE) criterion for classification is a linear discriminant function-based method that has been widely applied to tasks such as face and handwritten character recognition. However, MSE may not handle nonlinearly separable data sets well. To address this problem, we improve MSE and propose a new MSE-based algorithm, local MSE (LMSE), which is a local learning algorithm. For a test sample, we first determine its nearest neighbors in the training set. Using these neighbors, we construct a local MSE model to predict the class label of the test sample. Because each model is fitted only to a small neighborhood, LMSE can effectively capture the nonlinear structure of the data. It generally outperforms MSE, particularly when the data distribution is nonlinearly separable. Extensive experiments on many nonlinearly separable data sets show that LMSE achieves desirable recognition results.
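The per-sample procedure described above (find nearest neighbors, fit a least-squares linear discriminant on them, score the test sample) can be sketched as follows. This is a minimal illustration, not the paper's exact method: the neighborhood size `k`, the Euclidean distance metric, and the one-hot target coding are all assumptions.

```python
import numpy as np

def lmse_predict(X_train, y_train, x_test, k=15):
    """Predict the class of one test sample with a local MSE model.

    k (neighborhood size) is a hypothetical parameter; the paper's
    exact neighbor count and target coding may differ.
    """
    # 1. Find the k nearest training samples (Euclidean distance assumed).
    dists = np.linalg.norm(X_train - x_test, axis=1)
    idx = np.argsort(dists)[:k]
    X_loc, y_loc = X_train[idx], y_train[idx]

    # 2. Build the local MSE model: least-squares fit of augmented
    #    (bias-extended) features to one-hot class targets.
    classes = np.unique(y_loc)
    T = (y_loc[:, None] == classes[None, :]).astype(float)  # one-hot targets
    X_aug = np.hstack([X_loc, np.ones((k, 1))])             # append bias term
    W, *_ = np.linalg.lstsq(X_aug, T, rcond=None)

    # 3. Predict the class whose local discriminant score is largest.
    scores = np.append(x_test, 1.0) @ W
    return classes[np.argmax(scores)]
```

Because a fresh linear model is fitted inside each neighborhood, the overall decision boundary is piecewise linear and can follow a nonlinear class boundary, which is the intuition behind LMSE's advantage over a single global MSE fit.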