Computation is often described both as symbol manipulation and as having originated with Alan Turing. Yet Turing imposed certain finiteness conditions on computations, such as the requirement that a program be finite in length. These finiteness conditions are not captured by the description of computation as symbol manipulation. So it looks like you cannot both define computation as symbol manipulation and say that it is Turing's definition. Doing so takes a mere feature of Turing's formulation of a computational formalism to be his definition.
I think Gualtiero, for one, at least flirts with this problem in some of his papers on computation.