The quality of any given program (or subsystem) obeys the following rule (due to Dijkstra):

Qp = Sk / |p|

Where Qp is the quality, Sk the skill of the coders, and |p| the size of the system.
The bottom line is: big programs suck. Beauty lies in minimal complexity.
Interestingly, the size here is not measured up to an additive constant as in Kolmogorov complexity: it's the actual line-of-code count. So finding the right abstractions, or using the right languages, can help build more expressive systems while keeping the same quality.
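As a toy illustration (mine, not part of the original rule): the same program, written with and without the right abstraction from the standard library, differs in raw line count even though both compute the same thing.

```python
from collections import Counter

# Counting word frequencies "by hand": explicit bookkeeping everywhere.
def word_counts_manual(text):
    counts = {}
    for word in text.split():
        if word in counts:
            counts[word] += 1
        else:
            counts[word] = 1
    return counts

# The same program with the right abstraction: one line of actual logic.
def word_counts(text):
    return Counter(text.split())
```

By the rule above, the second version buys quality for free: same skill, same behavior, smaller |p|.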
So this is also the golden rule for discarding generalizations: if an abstraction does not simplify your code (your current code, in your current project), discard it.
Corollary: generalizations should be made a-posteriori (via refactoring) and never a-priori (via some kind of optimistic design that tries to guess what's going to be useful in the future).
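A minimal sketch of the corollary (the function names are hypothetical, just for illustration): write the concrete code first, and extract the generalization only once the duplication actually exists.

```python
# Step 1: two concrete functions, each written when it was actually needed.
# An a-priori design would have guessed at a generic framework up front.
def total_price(items):
    return sum(item["price"] for item in items)

def total_weight(items):
    return sum(item["weight"] for item in items)

# Step 2: the duplication now exists, so refactoring it away is a-posteriori
# generalization: the abstraction is justified by code we can already see.
def total(items, field):
    return sum(item[field] for item in items)
```

The generic `total` earns its place only because it replaces two real functions; had it been designed first, it might have guessed the wrong axis of variation entirely.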
Below a given quality, the system will simply fail. So no matter what, there is a size limit for our projects that human beings cannot cross. It's interesting to note how general this rule is: before computers, people were the computers, and other people (mathematicians) programmed them, optimizing analytically as far as possible. With integrated circuits we moved our entire workforce to the "outer" shell, the programming one. Data-driven designs, generic programming, code generation, and domain-specific languages are all attempts to jump to yet another meta-level.
P.S. Of course, this is something I've completely made up, I wanted to see how I could embed math in my posts...