Workshop #94: Round up some of the primary applications in your area of work and expertise, both your own products and those of your competitors (demo or trial software is fine). Create a matrix of applications, and then set up the criteria you would like to compare. Walk through all of the applications with an eye toward dispassionately seeing how they all stack up against each other (i.e., an application "shoot-out"). Gather your findings and analyze them. Determine which applications come out on top in the various categories and write up executive summaries. Share these findings with your development/product team.
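By way of illustration, the matrix itself can start out as nothing fancier than rows of products and columns of criteria. Here's a minimal sketch in Python (the product names and criteria are made up; fill the cells in as you work through each application):

```python
# Hypothetical products and comparison criteria; the cells start
# empty and get filled in as you evaluate each application.
products = ["OurApp", "CompetitorA", "CompetitorB"]
criteria = ["install", "startup_time", "ease_of_use", "import_export"]

matrix = {p: {c: None for c in criteria} for p in products}
matrix["OurApp"]["install"] = "pass"  # example entry

# Print the matrix as a simple table.
print(f"{'product':<14}" + "".join(f"{c:<15}" for c in criteria))
for p in products:
    row = "".join(f"{str(matrix[p][c]):<15}" for c in criteria)
    print(f"{p:<14}{row}")
```

The same structure drops straight into a spreadsheet later, one row per product and one column per criterion.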
I'll be frank: competitive analysis is fun, if you approach it with the right mindset.
First and foremost, you have to make one simple commitment: leave your personal biases, passions, and loyalties at the door. If you want to do competitive analysis, you have to be prepared for the fact that your product may not perform well.
Second, you need to construct workflows that are representative of the way that your company's and your competitors' products interact with their users. Get creative with this, and see how many of these workflows you can create. Work through ways that you can represent the interactions both quantitatively and qualitatively.
For example, you can use benchmarking tools, or you can use your automation tool of choice to run each system through hundreds or thousands of looped tests. How did your app do? How did the competitors' apps do? Can you record the details in such a way that, when you declare the winners and losers, you do it based purely on the numbers, and not on your knowledge of what's been entered? Note: this works best when you have a team assign random numbers to each platform, so that you, as the evaluator, don't know which numbers correspond to which product.
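A minimal sketch of that kind of blinded, looped timing run might look like the following (the workflow functions here are simple stand-ins for driving the real products through a workflow, and the product names are hypothetical):

```python
import random
import statistics
import time

# Stand-ins for running each product through the same workflow; in
# practice these would call your automation tool of choice against
# your app and your competitors' trial installs.
def run_our_app():
    return sum(i * i for i in range(1000))

def run_competitor_a():
    return sum(i * i for i in range(2000))

apps = {"OurApp": run_our_app, "CompetitorA": run_competitor_a}

# Blind the evaluation: assign each app a random code so whoever
# reads the timings doesn't know which product produced them.
codes = random.sample(range(100, 1000), k=len(apps))
blinded = dict(zip(codes, apps.items()))

results = {}
for code, (name, workflow) in blinded.items():
    timings = []
    for _ in range(200):  # looped runs; scale up to hundreds/thousands
        start = time.perf_counter()
        workflow()
        timings.append(time.perf_counter() - start)
    results[code] = statistics.mean(timings)

# The evaluator sees only the codes; the code-to-product key stays
# sealed with the team until the verdicts are written up.
for code, avg in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"platform {code}: {avg * 1000:.3f} ms average")
```

The important design point is the separation of duties: one person holds the code-to-product mapping, another reads the numbers and calls the winners.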
Quantitative reviews are fairly simple; if the specs and numbers are faster for one app vs. the other, it will be there in the numbers. Qualitative reviews are trickier. How do you rate user experience? What feels good vs. what feels not so good? In qualitative reviews, language needs to be precise, and it needs to be consistent. Since you are comparing many different examples, you want to make sure that the level of your review, and the language you are using, is applied fairly. Collect the data from these activities in a database or, if you want to be old school, use a spreadsheet. See my earlier post about dashboards; competitive analysis is a fun place to play around with dashboards, because you are looking at lots of interesting data points that can be fiddled with.
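To make the "decide it on the numbers" roll-up concrete, here's a small sketch of turning collected scores into per-category winners (the product names, categories, and scores are all invented for illustration; a real run would pull these rows from your database or spreadsheet):

```python
from collections import defaultdict

# Each row is one observation: (product, category, score). Quantitative
# scores come from the benchmark runs; qualitative ones from a fixed
# rating scale (say 1-5) applied with the same language every time.
observations = [
    ("OurApp", "startup_time", 4),
    ("OurApp", "ease_of_use", 3),
    ("CompetitorA", "startup_time", 2),
    ("CompetitorA", "ease_of_use", 5),
]

# Group the raw scores by (category, product).
scores = defaultdict(list)
for product, category, score in observations:
    scores[(category, product)].append(score)

# Average each cell, then pick the top product per category.
averages = {key: sum(v) / len(v) for key, v in scores.items()}
winners = {}
for (category, product), avg in averages.items():
    if category not in winners or avg > averages[(category, winners[category])]:
        winners[category] = product

for category, product in sorted(winners.items()):
    print(f"{category}: {product} ({averages[(category, product)]:.1f})")
```

A per-category summary like this is also exactly the shape of data a dashboard wants to consume.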
Doing competitive analysis is a great way to do a little bit of sleuthing and play detective with competitors' products, but it has a marvelous knock-on effect. Looking at a whole range of products in a similar product space will quickly help you become a domain expert, not just on the products, but on the business space as well. Back when I worked at Connectix, I did a lot of testing on all of the available virtualization options at the time, and by the time I got through with doing that, I had learned a tremendous amount about how virtualization was being used and who was using it for what. The more examples and workflows you compare, and the more products you cover, the deeper the domain knowledge you get to develop. It's cool how that works.