Now some good news (possibly). Network Computing (NC) has announced plans to publish “rolling NAC product reviews” based on its comprehensive testing of NAC products. So why is this important (maybe)? Because NC has a relatively good reputation for evaluating technical network requirements and products. Does this naturally extend to NAC? Maybe. Maybe not. The jury (we readers) is still awaiting evidence and expert testimony.
First, some background. Every online “magazine” is currently trying - often desperately - to carve out a market position as a “major source” of information on NAC (network admission control and network access control). Network World (NW), for example, offers “NAC Cram Session”, a currently weak collection of content of uneven quality and timeliness stitched together under an awful name. And recently NC announced its new “NAC Immersion Center.” So far, like the other publications’ efforts, this is largely a repackaging and re-branding exercise with a promise of better things to come.
But with NC we have some basis for expecting more. Mike Fratto is knowledgeable, well-intentioned and humble (I admit, this is secondhand knowledge), and NC has historically acted according to a seemingly higher journalistic standard than many other network publications. So there is a solid basis for hope. But optimism?
So what will it take for NC to earn its stripes in NAC coverage? With its upcoming NAC reviews it has created an opportunity to succeed or fail, and its readers should set a high standard for quality to judge how well NC performs. Here are my thoughts on what I would say to Mike if he cared to listen to my lone voice. I welcome yours.
NC needs to publish a detailed test plan so everyone understands what they are evaluating, why, what would satisfy or please them, and some idea of how important NC considers each capability to be. The absence of this information severely weakened the recent Network World NAC product reviews. NC should avoid this amateurish mistake.
NC should review its test plans with its readers BEFORE publishing its test results and analysis, so there is a better chance readers will appreciate and consider NC’s frame of reference BEFORE being distracted by NC’s judgments of specific NAC products.
I encourage NC to resist the “irresistible urge” to publish numerical scores as these are most often a disservice to vendors and potential buyers. Instead, please focus on spreading actual knowledge rather than scores.
Please provide readers with an in-depth view of your evaluation model so they can understand the variables and your weighting. Readers will then have the important opportunity to tailor your model to meet their own needs and preferences. That would be a great service. In contrast, NW did this only at a macro level - which is meaningless.
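To illustrate what I mean, here is a minimal sketch of a reader-tailorable evaluation model. This is my own hypothetical example, not NC’s actual model; the capability names, scores, and weights are purely illustrative:

```python
# Hypothetical sketch: a reader-tailorable product evaluation model.
# The capabilities, scores, and weights are illustrative only; NC's
# actual variables and weightings may differ entirely.

# Reviewer-assigned scores per capability (0-10 scale, illustrative)
scores = {
    "policy_enforcement": 8,
    "endpoint_assessment": 6,
    "guest_access": 7,
    "manageability": 5,
}

# Default weights published with the review (must sum to 1.0)
default_weights = {
    "policy_enforcement": 0.40,
    "endpoint_assessment": 0.25,
    "guest_access": 0.15,
    "manageability": 0.20,
}

def weighted_score(scores, weights):
    """Combine per-capability scores using the given weights."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[c] * weights[c] for c in scores)

# A reader who cares mostly about guest access re-weights the SAME data:
my_weights = {
    "policy_enforcement": 0.20,
    "endpoint_assessment": 0.15,
    "guest_access": 0.45,
    "manageability": 0.20,
}

print(f"NC default view:  {weighted_score(scores, default_weights):.2f}")
print(f"My tailored view: {weighted_score(scores, my_weights):.2f}")
```

The point is that the same review data yields different rankings for different readers, which is exactly why the variables and weights must be published rather than buried inside a single composite score.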
I hope you have already sought beneficial input from vendors and respected security professionals BEFORE you defined your test plan. Knowing that you did, and who they are, can only increase the value of what you are doing and the credibility of NC’s results.
(Added after writing the post NAC Product Testing. Is There a Better Way?) Your readers could learn a great deal more about individual products AND product categories AND their suitability for various situations if they could observe and participate in constructive discussions and debates about your tests and findings AFTER you publish them. In this revised model for product evaluations, a forum where reviewers, vendors and your readers contribute their ideas becomes a major part of your product evaluation “service”. In one sense, NC becomes the instructor who successfully unleashes the incredible power of student knowledge. Yes, this would mean NC would need to rethink its product review model and create an effective new forum. But you can tap into key existing components: the latest web technologies, a huge pool of knowledgeable readers, and their desire to be heard (questions and answers).