In object recognition, soft-assignment coding enjoys computational efficiency and conceptual simplicity. However, its classification performance is inferior to that of the newly developed sparse or local coding schemes. It would be highly desirable if its classification performance could become comparable to the state of the art, leading to a coding scheme that perfectly combines computational efficiency and classification performance. To achieve this, we revisit soft-assignment coding from two key aspects: classification performance and probabilistic interpretation. For the first aspect, we argue that the inferiority of soft-assignment coding is due to its neglect of the underlying manifold structure of local features. To remedy this, we propose a simple modification that localizes the soft-assignment coding, which surprisingly achieves comparable or even better performance than existing sparse or local coding schemes while maintaining its computational advantage. For the second aspect, based on our probabilistic interpretation of soft-assignment coding, we give a probabilistic explanation of the magic max-pooling operation, which has been used successfully by sparse or local coding schemes but is still poorly understood. This probabilistic explanation motivates us to develop a new mix-order max-pooling operation that further improves the classification performance of the proposed coding scheme. As experimentally demonstrated, the localized soft-assignment coding achieves state-of-the-art classification performance with the highest computational efficiency among the existing coding schemes.
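To make the localization idea concrete, the following is a minimal sketch of one common form of localized soft-assignment coding: a Gaussian-kernel soft assignment whose normalization is restricted to the `k` nearest codewords, with all other entries set to zero. The function name, the parameter `beta` (kernel smoothness), and the parameter `k` (neighborhood size) are illustrative choices, not the paper's exact notation.

```python
import numpy as np

def localized_soft_assignment(x, codebook, beta=1.0, k=5):
    """Soft-assignment code for one local feature, restricted to the
    k nearest codewords (all other entries are zero).

    x        : (d,) local feature descriptor
    codebook : (M, d) matrix of M codewords
    Returns a (M,) code vector whose nonzero entries sum to 1.
    """
    # Squared Euclidean distance from x to every codeword
    d2 = np.sum((codebook - x) ** 2, axis=1)
    # Indices of the k nearest codewords
    nn = np.argsort(d2)[:k]
    # Gaussian-kernel weights, normalized over the local neighborhood only
    w = np.exp(-beta * d2[nn])
    code = np.zeros(len(codebook))
    code[nn] = w / w.sum()
    return code
```

Compared with plain soft assignment, the only change is that the softmax-style normalization runs over the `k` nearest codewords instead of the whole codebook, which is what ties the code to the local manifold structure of the feature space while keeping the cost of a single distance computation per codeword.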