Sparse representation has been successfully applied to pattern recognition problems in recent years. The most common way to produce sparse coding is ℓ1-norm regularization. However, ℓ1-norm regularization favors only sparsity and does not consider locality: it may select quite different bases for similar samples in order to favor sparsity, which is disadvantageous to classification. Moreover, solving the ℓ1-minimization problem is time consuming, which limits its application to large-scale problems. We propose an improved algorithm for sparse coding and dictionary learning that takes both sparsity and locality into consideration. It selects the dictionary columns that are close to the input sample and imposes a locality constraint on these selected columns to obtain discriminative coding for classification. Because an analytic solution of the coding is derived using only part of the dictionary columns, the proposed algorithm is much faster than ℓ1-based algorithms for classification. We also derive an analytic solution for updating the dictionary in the training process. Experiments conducted on five face databases show that the proposed algorithm outperforms the competing algorithms in terms of both accuracy and efficiency.
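The coding scheme described above, selecting the nearest dictionary columns and solving a locality-constrained least-squares problem in closed form, can be sketched as follows. This is a minimal illustration in the spirit of locality-constrained linear coding, not the paper's exact formulation; the function name, the k-nearest-atom selection, and the regularization constant are assumptions introduced for the example.

```python
import numpy as np

def locality_constrained_code(x, D, k=5, lam=1e-4):
    """Hypothetical sketch: code sample x over its k nearest dictionary columns.

    x : (d,) input sample
    D : (d, n) dictionary whose columns are atoms
    Returns a length-n code with at most k nonzero entries.
    """
    n = D.shape[1]
    # 1. Locality: pick the k dictionary columns closest to the input sample.
    dist = np.linalg.norm(D - x[:, None], axis=0)
    idx = np.argsort(dist)[:k]
    Dk = D[:, idx]                       # (d, k) selected atoms
    # 2. Analytic solution of the small locality-constrained least-squares
    #    problem: only a k x k linear system is solved, not an n-dim one.
    Z = Dk - x[:, None]                  # atoms shifted to the sample
    G = Z.T @ Z                          # k x k Gram matrix
    G += lam * np.trace(G) * np.eye(k)   # regularize for numerical stability
    c = np.linalg.solve(G, np.ones(k))
    c /= c.sum()                         # sum-to-one (shift-invariance) constraint
    # 3. Scatter the local coefficients back into a full-length sparse code.
    code = np.zeros(n)
    code[idx] = c
    return code

rng = np.random.default_rng(0)
D = rng.normal(size=(8, 20))
x = rng.normal(size=8)
c = locality_constrained_code(x, D, k=5)
print(np.count_nonzero(c), round(c.sum(), 6))
```

Because only a k×k system is solved per sample (with k much smaller than the dictionary size), coding cost is tiny compared with iterative ℓ1 solvers, which is the efficiency gain the abstract claims.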