This is pretty fast problem solving, but it doesn't match reality.
Look at xpdf, for example: its code is duplicated in gpdf, kpdf, poppler, some command-line utilities, and probably others I am missing. Waiting for all of them to update to the newer xpdf code is unrealistic; it has been shown not to work, and to cause delays.
You write that they should all be updated, when a single package could be updated instead. I prefer the second option.
This is something that should be handled with proper tool support.
Why is it unrealistic? Because updating the code causes problems in those packages? Updating the code via dynamic linking doesn't improve that at all. Because the maintainer isn't sufficiently available to make the update in time, or in a coordinated fashion? Debian does non-maintainer updates to solve exactly this sort of problem; that kind of update just needs to be done more widely.

# Ian Bicking
It is not easy, because gpdf, kpdf, etc. don't have much knowledge of the xpdf code they ship, so updating takes time.
And the developers who bundle many Python packages to build an "application package" won't know the details of all those Python packages either: same problem.
Even worse, it gets much harder for an entity that does care (say, Debian) to apply the fixes: 1) they must be applied in many different places, and 2) those different places carry different versions of the code.
As for the tools to handle all of this, they do not exist at the moment.
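To make the pain concrete, here is a minimal sketch of what such a tool would be up against. The directory layout, file names, and version strings are hypothetical, purely for illustration: three packages each bundle their own copy of the same upstream source file, at different versions, so even locating every copy requires a tree-wide scan, and one patch will not apply cleanly everywhere.

```shell
# Hypothetical setup: three packages, each with its own bundled copy of
# an xpdf-derived file, each at a different upstream version.
mkdir -p /tmp/bundles/gpdf /tmp/bundles/kpdf /tmp/bundles/poppler
echo '#define XPDF_VERSION "2.02"' > /tmp/bundles/gpdf/XRef.cc
echo '#define XPDF_VERSION "3.00"' > /tmp/bundles/kpdf/XRef.cc
echo '#define XPDF_VERSION "3.01"' > /tmp/bundles/poppler/XRef.cc

# Problem 1: finding every bundled copy. grep is the crude stand-in for
# the tool support that doesn't exist; each hit is a place to patch.
grep -rl 'XPDF_VERSION' /tmp/bundles

# Problem 2: the copies diverge, so a single fix must be adapted per copy.
grep -rh 'XPDF_VERSION' /tmp/bundles | sort -u
```

With a shared library there would be exactly one file to patch and one version to track; with bundled copies, both steps above must be repeated for every consumer.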
IIRC you had slides contrasting developers with packagers ("deployers", maintainers, sysadmins, whatever the name). All of this ("much more comfortable and flexible for the developer", "isolated development environments") concerns the developers. Please don't forget the other side.