So, it has been virtually impossible for me to do any blogging lately because I have been working a lot of hours on my newest project. The discovery phase with this particular client was incredibly lengthy and laborious...they are a government agency which operates a major international airport, two regional airports, a bridge and a commercial seaport. Each business unit will have newly designed web sites fed content by a new, central SharePoint CMS.
The dollar value on this project is roughly 4 times the size of the largest project I had managed up to this point and every detail is...quite simply...bigger. There are 130 individual design templates handling content from two dozen uniquely structured content objects.
Although the work has been incredibly challenging, the project is very cool. We are getting the opportunity to design and develop some very innovative online tools for travelers. Because of the project's size, we have the opportunity to create a breadth and variety of project deliverables not usually included in smaller web projects, and this has allowed us to develop our internal capabilities immensely.
One of the new deliverables we developed was a 150-page comparative study and gap analysis. Normally when I do something new I will look for examples of the finished product to help guide my thinking while I develop a plan for the work package. In this case, it was very difficult to find any good examples. It seemed as though very few people write comparative studies on web sites. My theory is that this type of study is difficult to execute across a body of web sites, even in vertical markets, because they are so rarely 'like objects'.
My basic approach to the benchmarking study and comparative analysis was to first go out and create an inventory of every web site I could find in each of the client's business categories. In my current client's case this ended up being a list of 85 web sites. I then devised a method for a large group of people to run through all the sites and give them a rating in each of the following categories: Design, Features, Information Architecture, SEO, Code Compliance and Overall Execution.
The reviewers were instructed to go through the entire list as quickly as possible...spending no more than 2 minutes on each site. The goal here was to get some averages and narrow the field, not to do a final analysis in this early step. After 20 individual reviewers had completed their passes we were able to see some definite patterns, and so we selected the top 4 web sites in each category.
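The rating roll-up in this first pass is simple enough to sketch in a few lines of code. This is just an illustration of the averaging-and-ranking idea, not the actual tool we used (we worked from spreadsheets); the data structures and names here are made up for the example.

```python
from collections import defaultdict

# The six categories each reviewer scored every site on.
CATEGORIES = ["Design", "Features", "Information Architecture",
              "SEO", "Code Compliance", "Overall Execution"]

def average_ratings(reviews):
    """reviews: list of (site, category, score) tuples from all reviewers.
    Returns a dict mapping (site, category) -> mean score."""
    totals = defaultdict(lambda: [0.0, 0])  # (site, category) -> [sum, count]
    for site, category, score in reviews:
        totals[(site, category)][0] += score
        totals[(site, category)][1] += 1
    return {key: total / count for key, (total, count) in totals.items()}

def top_sites(averages, category, n=4):
    """Pick the n highest-rated sites in one category."""
    scored = [(site, avg) for (site, cat), avg in averages.items()
              if cat == category]
    return [site for site, _ in sorted(scored, key=lambda x: -x[1])[:n]]
```

With 85 sites, 6 categories and 20 reviewers you end up with roughly 10,000 individual scores, which is why averaging before any detailed analysis is the only sane first step.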
The next step was to confirm our assumptions so far with the client. They agreed with our selections and selection process but asked us to add a handful of their known favorites, which we did.
Next we conducted a detailed analysis of each of the leading web sites on our list. This ended up being a dozen web sites across 3 business categories. The detailed analysis had two main components. First, we designed a much more detailed survey method for assigning ratings to the sites in a number of new categories. Second, we broke out all the individual features in our comparative matrix and analyzed them.
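The feature break-out is essentially a presence/absence matrix: sites down one axis, features down the other, and the gaps fall out of it. A minimal sketch of that idea, with hypothetical site and feature names (ours was assembled by hand, not generated):

```python
def feature_matrix(sites):
    """sites: dict mapping site name -> set of features it offers.
    Returns dict mapping feature -> {site: has_feature} rows."""
    all_features = sorted(set().union(*sites.values()))
    return {feature: {site: feature in feats for site, feats in sites.items()}
            for feature in all_features}

def gaps(matrix, our_site):
    """Features any competitor offers that our_site is missing."""
    return [feature for feature, row in matrix.items()
            if not row.get(our_site) and any(row.values())]
```

Laid out this way, the gap analysis is just the rows where the client's column is empty while a competitor's is not.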
Finally, we conducted an editorial review of each site, the client's current site and their planned site.
In the end our document was approximately 150 pages, which the client was very happy with. The document included detailed descriptions of our methodology, rationale, assumptions and parameters. It also featured a lot of screen shots, images, charts and diagrams to augment the lengthy language required.
OK, Q-Tip is on Dave Letterman right now so I need to stop working and zone out...