Where does that leave Plone? Well, for one thing, the Plone listing on CMS Matrix was last updated on Sept. 8, 2007. I'm guessing there's a fair piece of updating to be done. Aye, but who's to do it?
Meanwhile, Netvolution is rated by users as best in system requirements and support. But a closer look shows that Netvolution's rating is based upon a single 10-point rating. It looks like CMS Matrix has some work to do before one can trust their numbers without digging deeper. To their credit, they do let you dig deeper...
Here's where Plone stands:
| Category | Rating | Count | Ahead |
|---|---|---|---|
| System Requirements | 5.77 | 95 | 7 |
| Security | 6.62 | 97 | 2 |
| Support | 6.23 | 97 | 4 |
| Ease of Use | 6.65 | 95 | 2 |
| Performance | 5.99 | 97 | 6 |
| Management | 6.63 | 97 | 0 |
| Interoperability | 6.54 | 97 | 0 |
| Flexibility | 7.04 | 95 | 0 |
| Built-in Applications | 6.54 | 97 | 0 |
| Commerce | 5.07 | 97 | 8 |
Rating is the average score given by the user respondents. Count is the number of respondents, or reviewers. Ahead is my own column: the number of CMSes that scored higher than Plone and also drew a larger number of respondents. Plone does very well by this metric, given that so many CMSes have only one or two reviewers, and those probably quite biased.
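To make that definition concrete, here is a minimal Python sketch of how the Ahead column could be computed. The figures and CMS names other than Plone's are hypothetical; CMS Matrix only exposes its numbers through the site, so you would plug in the real ones.

```python
# A minimal sketch of how the "Ahead" column above might be computed.
# The data here is hypothetical: substitute real figures from CMS Matrix.

# (cms_name, average_rating, respondent_count) for one category, e.g. Security
hypothetical_scores = [
    ("Plone",  6.62,  97),
    ("Drupal", 6.69, 250),   # made-up respondent counts for illustration
    ("CMS A",  9.50,   2),
    ("CMS B",  7.10, 120),
    ("CMS C",  8.00,   1),
]

def ahead_count(scores, baseline="Plone"):
    """Count CMSes that both scored higher than the baseline and
    drew more respondents: the 'Ahead' definition used above."""
    base_rating, base_count = next(
        (r, c) for name, r, c in scores if name == baseline
    )
    return sum(
        1
        for name, rating, count in scores
        if name != baseline and rating > base_rating and count > base_count
    )

print(ahead_count(hypothetical_scores))  # -> 2 (Drupal and CMS B in this toy data)
```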
Generally speaking, the more people who rate a system, the lower the average score: two hundred reviewers won't all be so brazen as to rate the same feature a '10'. What are the chances that any CMS will meet 100% of their requirements?
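Here is a rough simulation of that small-sample effect, under the made-up assumption that every reviewer draws from the same high-skewed 1-to-10 distribution regardless of which CMS they rate. A product with one or two reviewers can easily average a 9 or 10; one with a hundred reviewers almost never will.

```python
# Illustrative only: assumes all reviewers draw from the same distribution,
# skewed toward high scores (people who bother to review tend to be fans).
import random

random.seed(42)

def average_rating(n_reviewers):
    """Average of n independent ratings on a 1-10 scale."""
    return sum(
        random.choices(range(1, 11), weights=[1, 1, 1, 2, 2, 3, 4, 5, 6, 6])[0]
        for _ in range(n_reviewers)
    ) / n_reviewers

for n in (1, 2, 10, 100):
    samples = [average_rating(n) for _ in range(1000)]
    print(f"{n:>3} reviewers: highest average seen in 1000 trials = {max(samples):.2f}")
```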
CMS Matrix may be useful for feature comparisons, but it has some serious flaws. For example, how does Plone (6.62) rate lower than Drupal (6.69) on Security when the MITRE CVE listing shows Plone to be far, far more secure?
The bottom line? Don't believe what you read, at least not without a little critical thinking and an understanding of the limitations of a given method.