Ian Bicking: the old part of his blog

Packaging python comment 000

However, another good indication of the fact that deployers deal with libraries directly is the phenomenon of library configuration; for example, /etc/fonts/fonts.conf - someone deploying a GNOME desktop will edit that file, and it doesn't configure the X server "application" - it is a common configuration file read by every application that uses fontconfig, generally invoked by the Xft library.

That's an example of a situation where discovery and registration of resources is necessary. That is definitely a problem with bundled software, though it is not trivial in a centralized system either. /etc/fonts/fonts.conf has policy associated with it, and various scripts that do things to that file -- all of which must work properly for the entire system to work properly -- and all as an augmentation of the (fairly slow) index of package metadata that exists elsewhere.

It's harder with software bundles, but I don't think that isolated installs need to be entirely isolated from the system either. I think it's better to start from a default of full isolation and add sharing conventions from there. It's not easy regardless of what you are doing.

(By the way: fontconfig is about 1M all told, so although disk space isn't as big a deal as it used to be, a thousand bundled copies would mean 1G of nothing but fontconfig. Once you add in the inevitable X and GNOME dependencies in each package too, that number would explode into impracticality really fast.)

Perhaps caching of shared content needs to be a central concept of this. Something I've meant to add to working-env.py, for instance, is a way of linking in libraries from elsewhere -- probably using Setuptools .egg-link files (which are largely equivalent to platform-independent symlinks). In that case it would be opt-in sharing -- which is better than implicit sharing without any opt-out option at all (except careful manipulation and stacking of the entries on sys.path). But a more implicit sharing of resources that happen to be identical would also be possible.
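An .egg-link file is little more than a text file whose first line is the path of the project it points to (a second line, if present, is relative to the first). A rough sketch of how a working environment might honor one -- the function names here are mine, not setuptools API:

```python
import os
import sys

def resolve_egg_link(link_path):
    """Read a setuptools-style .egg-link file and return the project
    directory it points to.  The first non-blank line is the path; a
    second line, if present, is a path relative to the first."""
    with open(link_path) as f:
        lines = [line.rstrip("\n") for line in f if line.strip()]
    target = lines[0]
    if len(lines) > 1:
        target = os.path.join(target, lines[1])
    return os.path.normpath(target)

def link_into_environment(link_path):
    """Opt-in sharing: put the linked project on sys.path, roughly what
    honoring an .egg-link dropped into a working environment would do."""
    target = resolve_egg_link(link_path)
    if target not in sys.path:
        sys.path.insert(0, target)
    return target
```

The point of the sketch is that sharing happens only because a link file was explicitly dropped into the environment, rather than because the library was implicitly visible on a global path.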

I don't think fontconfig is actually an example of a resource at all, but the problem certainly exists. I'm also not sure how far down to push this isolation. Right now most Python applications depend on a set of libraries that mostly should be bundled. Should everything be bundled everywhere? Thinking about what that would mean would be an interesting thought experiment ;) I'm not even sure what it would look like.

Deployers can use library configuration to configure and customize large groups of applications at a time. In the best case, a library can allow you to tweak its behavior independently of an application, to affect an application's behavior (or many applications' behavior) without the applications having explicitly coded any support for it.

Applications should delegate to their component libraries when possible and reasonable, and let information pass down. This usually has nothing to do with the packaging used. Applications would still use libraries, and those libraries can still look at their environment; nothing changes with respect to that.
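As a toy illustration of that delegation (the library name, file path, and setting here are all invented for the example, in the spirit of fonts.conf): a library that reads a shared configuration file lets a deployer change the behavior of every application that uses it, without any of those applications coding support for the setting.

```python
import json
import os

# Hypothetical system-wide config file, analogous to /etc/fonts/fonts.conf:
# edited by the deployer, never by the applications.
CONFIG_PATH = os.environ.get("DEMOLIB_CONFIG", "/etc/demolib/config.json")

def load_settings(path=None):
    """Read the shared settings, falling back to library defaults."""
    try:
        with open(path or CONFIG_PATH) as f:
            return json.load(f)
    except (FileNotFoundError, ValueError):
        return {"antialias": True}

def render_text(text, settings=None):
    """The application-facing call: its behavior follows the shared
    config, so one edit by a deployer affects every caller."""
    if settings is None:
        settings = load_settings()
    mode = "smooth" if settings.get("antialias", True) else "plain"
    return f"[{mode}] {text}"
```

An application just calls render_text(); whether the output is "smooth" or "plain" is decided by a file the application never opens itself.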

Comment on Re: Packaging python comment 000
by Ian Bicking


Applications should delegate to their component libraries when possible and reasonable, and let information pass down. This usually has nothing to do with the packaging used.

In fact it does. Default configuration is very much part of the library package, at least on Debian.

Applications would still use libraries, and those libraries can still look at their environment; nothing changes with respect to that.

The thing that does change is that, within a compatible version of a library, the format of the system configuration for the library may change, or features may be added, without the applications necessarily being alerted to that fact. Or an entirely new version might be released, which provides a compatibility layer.

Now, there are ways to design around this (future-proofing your format, and so on), but demonstrably library authors do not always do it. Configuration formats do change, and will continue to change whether or not application authors start bundling everything under the sun with their applications.

Right now the user experience of this is: you upgrade the library, and Debian prompts you if you want to upgrade your system config file. You (and your users) have to infer that you must upgrade the files under ~/. as well, but at least one upgrade to that file and you're done with it. If every application packages every library, all of a sudden you've got 1091 copies of fonts.conf, under /etc/gaim/fonts/fonts.conf, /etc/gimp/fonts/fonts.conf, and so on, and you have to track the version of fontconfig used by every single one of those apps manually. Even if you make no modifications, the package maintainer for each of those applications suddenly has to become a fontconfig expert, whereas before they didn't even have to know this file existed.

If you include .egg-link files with your application that "link" to other libraries, how is that different from an import statement "linking" to another library? It's just adding additional work to your import lines. What if I want to write a plugin for application X which imports a library from application Y? What is my "application"? How do I install under package X in such a way that I can then "link" to package Y? Perhaps each project should also come with an XML config file which describes all its dependencies? The pygtk project had thoughts along these lines before, and their solution has mainly made people unhappy: http://www.tortall.net/mu/blog/2006/01/18/pyxyz_require_sense

I've been talking a lot about random C libraries as if they might be in Python (and I hope in the future more will) but let me speak directly and practically about Python as it is now. The status quo may not be perfect, but it effortlessly allows me to write python plugins for nautilus or gaim which import twisted, gtk, vte, sqlite, ogg, BitTorrent, or any other library on my system. It seems you want to break that by unbundling every library (I have over 100 python libraries installed through Ubuntu's packaging system, and a half-dozen installed in my user environment) from my system, and putting it into the applications which use it, and making me re-install or re-declare the use of those libraries in my "working environment" for the plugin, all apparently to prevent some hypothetical breakage. (Is the plugin an "application"? How does its environment differ from that of the (usually non-Python) application it's hosted in?)

To get to the bottom of this, though: what's the real problem you're trying to solve here? Is it just making side-by-side installation so that applications don't break when subtly different versions of libraries are installed?

# Glyph Lefkowitz