Saul Kripke once noted that there is a tight connection between computation and de re knowledge of whatever the computation acts upon. For example, the Euclidean algorithm can produce knowledge of _which number_ is the greatest common divisor of two numbers. Arguably, algorithms operate directly on syntactic items, such as strings, and operate on numbers and the like only indirectly, via how those objects are represented. So we broach matters of _notation_. The purpose of this article is to explore the relationship between the notations acceptable for computation, the usual idealizations involved in theories of computability (flowing from Alan Turing’s monumental work), and de re propositional attitudes toward numbers and other mathematical objects.
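
To fix ideas, here is a minimal sketch of the Euclidean algorithm in Python (the function name `euclid_gcd` and the printed example are illustrative assumptions, not drawn from the original text). The point the sketch illustrates is the one just made: the machine manipulates representations (here, Python integer literals and their decimal display), and it is only via a chosen notation that the output can furnish knowledge of _which number_ the greatest common divisor is.

```python
def euclid_gcd(a: int, b: int) -> int:
    """Classical Euclidean algorithm: repeatedly replace the pair
    (a, b) by (b, a mod b) until the second argument reaches zero."""
    while b != 0:
        a, b = b, a % b
    return a

# The computation acts on syntactic items: the decimal numerals
# "252" and "105" are parsed into an internal representation, and
# the result is rendered back as the numeral "21". Arguably, it is
# this rendering, under a fixed notation, that yields de re
# knowledge of which number the gcd is.
print(euclid_gcd(252, 105))  # prints 21
```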