It is common in cognitive science to equate computation, digital computation in particular, with information processing. Yet a comprehensive, explicit account of concrete digital computation in information-processing terms is hard to find. An Information Processing account seems a natural candidate for explaining digital computation: after all, digital computers traffic in data. But once 'information' comes under scrutiny, this account becomes a less obvious candidate. 'Information' may be interpreted semantically or non-semantically, and its interpretation has direct implications for Information Processing as an objective account of digital computation. This paper examines the implications of these interpretations for explaining concrete digital computation in terms of information processing. To begin with, I survey Shannon's classic theory of information and then examine how 'information' is used in computer science. In the subsequent section, I evaluate the implications of each interpretation of 'information' for an Information Processing account. The key requirements for a physical system to compute are then fleshed out, as are some limitations of such an account. I conclude that any Information Processing account must embrace an algorithm-theoretic apparatus to be a plausible candidate for explaining concrete digital computation.