The role of operating systems is interesting to think about. I got to thinking about it after reading Rob Pike's Systems Software Research is Irrelevant (via).
As he notes, there aren't many operating systems left. Mac OS is gone, replaced by NeXTSTEP (in the guise of Mac OS X), which is BSD with a different windowing system. There are still a number of Unixy systems, but the practical diversity in that realm is decreasing. And there's the Windows family.
Every so often people try to make a new operating system. But it seems like a hard thing to justify. Most of the development is done in an emulator (unless you are masochistic), so you are still running on some other OS. It doesn't take long to realize that running your OS on the bare hardware doesn't actually provide many benefits over the emulator itself. So why make an operating system at all? Or, why not treat an existing OS as your bare hardware?
An interesting parallel might be in CPU development. At some point x86 was no longer an instruction set, but became a virtual machine bytecode. Those instructions no longer related to the CPU cores, but were a standard against which you could develop your hardware. We are seeing a decline in the number of instruction sets, but an increase in the implementations of that instruction set.
Maybe that's a contrast instead of a parallel. Anyway, I think diversity is difficult, and we have to look for ways to avoid it when we can - combining diversities leads to geometric (or worse) increases in difficulty. For instance: Linux handles a diverse set of hardware. God bless those device driver programmers, because I'm glad I have nothing to do with that. Instead the diversity is controlled via stable, generic interfaces to that hardware. There are many such stable, generic points in the system.
Coming back to that original paper, I'd say if you want to make an OS, maybe you should think hard about what "operating system" really means. Maybe you shouldn't be too caught up in what it has meant in the past. Make Linux your bare hardware. Maybe go even further, make some entire distribution your bare hardware. What have you got to lose, a few CPU cycles? You'd still have the fun part of building an OS - open development, no preconceptions (except for those you choose). You just have a richer base to build on, so you can get to the interesting stuff more quickly.
Debian Linux with fbcon and ALSA would make a good platform for new operating system development.
However, read up on the MIT exokernel before deciding that further systems research is a waste of time :)
Perhaps you missed that :)
Yes, I had Cleese in mind when I was thinking about this. It's cool -- but maybe the reason it is cool is because it gives an operating system with no preconceptions. I don't think it's cool because implementing device drivers in Python would be fun ;)
As for the Exokernel... I don't really know enough to say. There's been stuff like this for years, stuff people called revolutionary, but I'm not sure where the concrete results are. Linux is a huge step backwards to the monolithic kernel, yet it has been more successful than microkernels. Not just because people are dumb, either -- Microsoft has moved away from NT's microkernel design as well, and if they can't pull it off then microkernels aren't obviously the best way to do it. Maybe they are a good way to do it, but not Right Now, apparently.
But I only look at this stuff from the sidelines occasionally, so I'm just guessing.

# Ian Bicking
exokernel != microkernel.
The thing is, many new operating system designs *are* revolutionary, but that does not make them commercial successes in the desktop market (witness NeXT, BeOS, etc.)
Revolutionary new operating systems have a greater chance of succeeding when they are developed for a new and relatively closed platform, like game consoles, mobile phones or other embedded systems.