Ian Bicking: the old part of his blog

Already under our noses?

Reading The Futility of Adding Types to a Dynamic Language (via), I realize we may already have all the mechanisms we need in Python. (Peter Lount seems to have a bunch of apropos articles there, but no syndication feed...?)

Anyway, I haven't read through them all, but this made me think:

Validations, all forms of them, are really a form of "meta data" about the program. "Meta data" are program statements (using a special syntax or using the general purpose language syntax) about the program itself. As Smalltalk has demonstrated the most powerful and general method of having meta data in a language is to have that meta data written in the general purpose syntax of the language itself. In this way meta data programming is the same as regular programming and many more programmers will be able to access this powerful dynamic capability. In Smalltalk style languages the "meta data" are first class objects. The impact of this is that much of the added meta data functionality can be written in the language itself and added as a group of "classes" (or objects) otherwise known as a Framework.

OK, well... I don't like Frameworks. But we have something else; everyone's favorite: decorators!

Everyone knows (or should know) about the type-checking decorators. These are simple, work right now, and really don't look bad at all:

@require(int, int)
def gcd(a, b):
    ...

It's rather simplistic, but the nice part is that it's easily implemented and easily reimplemented. If it were to become Part Of Python (i.e., included in the standard library, not in the syntax), then no doubt there'd be a lot of debate and work on it, with respect to interfaces, sequences, adapters, etc., as well as extensibility.
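
Here's a minimal sketch of one way require could be written (purely illustrative; the exact error messages and the handling of keyword arguments are exactly the kind of details that would get debated):

import functools

def require(*arg_types):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args):
            # Check the argument count, then each positional argument's type.
            if len(args) != len(arg_types):
                raise TypeError("%s() takes %d arguments (%d given)"
                                % (func.__name__, len(arg_types), len(args)))
            for value, expected in zip(args, arg_types):
                if not isinstance(value, expected):
                    raise TypeError("%s() expected %s, got %s"
                                    % (func.__name__, expected.__name__,
                                       type(value).__name__))
            return func(*args)
        return wrapper
    return decorator

@require(int, int)
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

print(gcd(12, 18))    # 6
# gcd(12, "18") would raise: TypeError: gcd() expected int, got str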

We want inspectable metadata as well, not just imperative validation at runtime. require can be modified to set known attributes on the function (e.g., __argspec__), or maybe something more sophisticated. Things like auto-completion would still have to load modules (and execute those decorators) to infer types and signatures, but I think that's okay. We should just get rid of modules that can't be safely loaded; they are a pain in the butt for everyone (including their authors). Much better that than static typing!
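
As a sketch of that (again hypothetical; __argspec__ is just an invented attribute name, not anything Python itself defines), the decorator could simply record the declared types on the wrapper where tools can read them without calling the function:

import functools

def require(*arg_types):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args):
            # ... runtime checks as in the sketch above ...
            return func(*args)
        # Record the declared types where tools can find them.
        # __argspec__ is an invented attribute name, not part of Python.
        wrapper.__argspec__ = arg_types
        return wrapper
    return decorator

@require(int, int)
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

print(gcd.__argspec__)   # (<class 'int'>, <class 'int'>)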

Created 06 Jan '05
Modified 15 Jan '05

Comments:

Even better, you could start with a stdlib module before you went to the builtin. I've always cringed at the thought of a "decorators" module as some have proposed (which seems to me about as meaningful as a module called "functions" or "objects"). But a module to collect type-checking techniques would be ideal. You could then provide the same tools for earlier versions of Python (pre-decorators, pre-unified-types, etc.). Various names present themselves. Perhaps it should be a submodule of pychecker? >:)
# Robert Brewer

Maybe the decorators could be part of the module that implements interfaces, since we sure should get interfaces into the standard library before we start thinking about type declarations anyway.
# Ian Bicking

Using decorators for type signatures on callable arguments has an important benefit: because of the extra typing involved, and because they sit on a separate line from the 'def' statement, the syntax alone would suggest to Python 'newbies' that they should only be used when really needed.

I like everything about optional argument type signatures for callables, except the idea of reading code written by Java programmers who are just beginning to use Python. They would use them on every darn thing under the sun.

Since it is a separate line, at least it is trivial to 're-factor' the code to remove an unnecessary type signature: just delete the line.

Kudos on using "restructured text" as the markup mechanism! Great stuff!

# Manuel Garcia