Adrian: once you create a dynamic site that is actually cache-friendly (or spider-friendly), it's not just the same benefits, it's pretty much the same thing as a static site, isn't it? Well, I guess a truly static site is event-based (you actively rerender pages), whereas a cached site isn't (the proxy figures out over time that pages have updated). The differences become subtle. In the spidering version, you've really just replaced a push (file write) with a triggered pull (from the spider). Either way, how you actually move the files around -- cache, pull, push, file write, FTP upload, SCP upload, and so on -- should hopefully be somewhat abstracted from the CMS, since requirements vary considerably. Upon further thought, I think the distinguishing aspect is render-on-demand vs. render-on-event.
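A minimal sketch of the transport abstraction Adrian is describing, where push (file write) and triggered pull (spider fetch) sit behind one interface the CMS calls. All class and method names here are invented for illustration; they are not from any real CMS.

```python
# Hypothetical sketch: the CMS only ever calls publish(); how the bytes
# actually reach the live site (push vs. triggered pull) is a transport detail.
from abc import ABC, abstractmethod


class Transport(ABC):
    """How rendered pages reach the live site."""

    @abstractmethod
    def publish(self, path: str, content: str) -> None: ...


class FileWriteTransport(Transport):
    """Push model: the CMS actively writes the rendered file."""

    def __init__(self, docroot: str):
        self.docroot = docroot
        self.written = {}  # stands in for the filesystem in this sketch

    def publish(self, path: str, content: str) -> None:
        self.written[f"{self.docroot}/{path}"] = content


class SpiderTransport(Transport):
    """Triggered pull: the CMS just marks URLs dirty for a spider to re-fetch."""

    def __init__(self):
        self.dirty = set()

    def publish(self, path: str, content: str) -> None:
        self.dirty.add(path)  # content is ignored; the spider pulls it later
```

Swapping FTP or SCP upload in means adding another `Transport` subclass, without the CMS's render code changing at all.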
Georg: yes, static is limited. The issue we're dealing with a lot is the workflow of publishing a site, and pieces that don't fit into the workflow usually wouldn't be put in the CMS. Though the more I work with the system, the more I see places where workflow applies. You could add or edit an item in your store, preview it, have the copy reviewed, etc., then publish it to your site. But obviously a store item is much richer than your typical HTML file -- or at least has a different set of metadata. And it has to interact with entirely non-static items, like a shopping cart, or inventory control. Some things don't make sense as event-based, or the events occur with a granularity that doesn't fit your model. When writing the CMS, I expect data to be updated on the order of a few items a day, or maybe the occasional big batch. An online forum wouldn't fit into this model at all. Anyway, it's my first foray into workflow, so I'm still thinking about the possibilities, and what it means for all the different parties involved.
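The store-item workflow Georg mentions (add/edit, preview, review, publish) could be sketched as a small state machine. The state names and transitions below are assumptions for illustration, not anything from the actual system.

```python
# Hypothetical workflow states for a store item: draft -> preview -> review
# -> published, with "back to draft" allowed until the item goes live.
ALLOWED = {
    "draft": {"preview"},
    "preview": {"review", "draft"},
    "review": {"published", "draft"},
    "published": set(),  # a published item needs a new draft to change
}


class StoreItem:
    def __init__(self, name: str):
        self.name = name
        self.state = "draft"

    def advance(self, new_state: str) -> None:
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"cannot go from {self.state} to {new_state}")
        self.state = new_state
```

The transition table makes the workflow explicit, which matters once different parties (author, reviewer, publisher) are only allowed to trigger some of the transitions.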
Also, one of the benefits of publishing into a system like Apache is that it's a good basis for heterogeneous sites, so I'm inclined to simply call upon external applications in these cases.