September 11, 2011

17 rules to design high-quality software

Extracted from the online version of "The Art of Unix Programming", by Eric S. Raymond (a short sketch illustrating two of the rules follows the list):
  1. Rule of Modularity: Write simple parts connected by clean interfaces.
  2. Rule of Clarity: Clarity is better than cleverness.
  3. Rule of Composition: Design programs to be connected to other programs.
  4. Rule of Separation: Separate policy from mechanism; separate interfaces from engines.
  5. Rule of Simplicity: Design for simplicity; add complexity only where you must.
  6. Rule of Parsimony: Write a big program only when it is clear by demonstration that nothing else will do.
  7. Rule of Transparency: Design for visibility to make inspection and debugging easier.
  8. Rule of Robustness: Robustness is the child of transparency and simplicity.
  9. Rule of Representation: Fold knowledge into data so program logic can be stupid and robust.
  10. Rule of Least Surprise: In interface design, always do the least surprising thing.
  11. Rule of Silence: When a program has nothing surprising to say, it should say nothing.
  12. Rule of Repair: When you must fail, fail noisily and as soon as possible.
  13. Rule of Economy: Programmer time is expensive; conserve it in preference to machine time.
  14. Rule of Generation: Avoid hand-hacking; write programs to write programs when you can.
  15. Rule of Optimization: Prototype before polishing. Get it working before you optimize it.
  16. Rule of Diversity: Distrust all claims for “one true way”.
  17. Rule of Extensibility: Design for the future, because it will be here sooner than you think.
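
To make two of these concrete, here is a minimal Python sketch (the names, thresholds, and rates are purely illustrative, not from Raymond's book) of the Rule of Representation and the Rule of Separation: the policy is folded into a data table, and the small, generic mechanism that interprets it is kept apart from it.

    # Hypothetical sketch: the discount policy lives in a data table
    # (Rule of Representation), while the lookup mechanism stays small
    # and generic (Rule of Separation).

    # Policy: folded into data -- easy to inspect, test, and change.
    DISCOUNT_TABLE = [
        # (minimum order total, discount rate)
        (500.0, 0.10),
        (100.0, 0.05),
        (0.0,   0.00),
    ]

    def discount_rate(order_total, table=DISCOUNT_TABLE):
        """Mechanism: walk the table and return the first matching rate."""
        for threshold, rate in table:
            if order_total >= threshold:
                return rate
        return 0.0

    def price_after_discount(order_total):
        """Compose the two pieces; the logic stays 'stupid and robust'."""
        return order_total * (1.0 - discount_rate(order_total))

    if __name__ == "__main__":
        for total in (50, 150, 750):
            print(total, "->", price_after_discount(total))

With the rules expressed as data, changing the policy means editing a table rather than rewriting control flow, which also serves the Rule of Clarity.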

September 8, 2011

A negative review automatically generated by the ICSERGen tool

The following review has been generated by the ICSERGen tool (by Crista Lopes) for the following paper:

Henrique Rocha and Marco Tulio Valente. How Annotations are Used in Java: An Empirical Study. 23rd International Conference on Software Engineering and Knowledge Engineering (SEKE), pp. 426-431, 2011.

Have fun!

===================

The following review has been automatically generated by a program.
The goal is to make fun of certain reviews made by certain reviewers in certain conferences.
Do not use this in your real reviews.
Enjoy!

*=--=*=--=*=--=*=--=*=--=*=--=*=--=*=--=*=--=*=--=*=--=*

The major problem with this paper is that there is nothing new here. A lot of this has already been proposed before. Some examples that come to mind are: just to name a few.

The paper talks about "...textual search program to find annotation...", but I haven't seen any discussion of that (neither in the theoretical part of the paper nor in the validation part). It is just mentioned. For this kind of work, it is _relevant_ how an approach can deal with that.

Some of the well-known concepts have just been renamed: "system" is nothing other than "complex, interlacement, mesh"; "class" is "caste, estate, folk"; parts of the approach have not even been given a name.

The paper uses the term "class." Class has been used in the context of AOP developed by Crista Lopes et al. The paper cannot use the same term and generate confusion (in many dimensions). A simple Google search would have helped with the naming.

The techniques are explained at a rather shallow level. No details. So, for example, what's the precise definition of "annotation"? How is system related to annotation? The paper talks about method, but why is that important? The role of "code" in the approach is not clear. When the paper gets to a bit more detail on these things, it stops abruptly.

The paper does not provide enough details for the work to be reproducible.

The difference from the Google annotation system facility is also not discussed.

I could not understand Fig 1; this kind of "visualization" is not effective (and also not intuitive).