Friday, September 12, 2008

Think your browsing activity doesn't matter? Try $150MM

Monday was a dismal day for United Airlines (UAL) as their stock slid 75% into the depths of despair. The sky was falling as news broke that they were going bankrupt (again!)...

It's not surprising, with oil prices, the economy, and this looming (will it ever really be one?) recession. Too bad it was all a mistake.

Wait, wait, what was that? No, really, it was all just a mistake. Google's crawler picked up an old news story from a Tribune news site and posted it to their headlines as if it were new. This spread through the internet and then spilled over to Wall Street, where UAL's stock took a dive. The Wall Street Journal lays out the events in this article; it's worth a read: http://online.wsj.com/article/SB122109238502221651.html

NASDAQ eventually halted trading and the slide was stopped. Unfortunately, the stock is still off about 9% from where it closed the Friday before. A single click on an old article cost the company (and its shareholders) nearly $150MM.

How can this happen? Much of the market is now driven by automated trading and things called ALGOS (algorithms), which constantly scan the web, news, and other information and make automated buy or sell decisions based on market news, the idea being that you can get an edge on the other guy. In a tumble like this, however, with such a big company, it became a cascade of failure: some ALGOS started to sell, then more sold because the other ALGOS had sold, and so on. A good explanation of it is here: http://www.reuters.com/article/reutersEdge/idUSN1039166420080910?pageNumber=2&virtualBrandChannel=0
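To make that mechanism concrete, here's a toy sketch (entirely my own, not any real trading system) of how one news-scanning bot plus a herd of price-following bots could turn a single stale headline into a rout. The keywords, starting price, and per-wave drop are all invented for illustration.

```python
# Toy illustration of a headline-driven sell cascade. Nothing here resembles a
# real trading system; the keywords, prices, and percentages are invented.

NEGATIVE_WORDS = {"bankrupt", "bankruptcy", "chapter 11", "default"}
COMPANY_TERMS = {"united airlines", "ual"}


def headline_is_negative(headline):
    """Naive check: does the headline mention the company and a scary word?"""
    text = headline.lower()
    return any(t in text for t in COMPANY_TERMS) and any(w in text for w in NEGATIVE_WORDS)


def simulate_cascade(price, num_algos, headline):
    """The first bot sells on the headline; each later bot sells because the
    price is already falling. Every wave knocks an (arbitrary) 3% off."""
    if not headline_is_negative(headline):
        return price
    for _ in range(num_algos):
        price *= 0.97
    return price


if __name__ == "__main__":
    stale_headline = "United Airlines files for bankruptcy protection"  # the resurfaced old story
    print(round(simulate_cascade(12.00, 40, stale_headline), 2))  # hypothetical starting price
```

The point isn't the numbers; it's that nothing in the loop ever asks whether the headline is actually new.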

This will only become more prevalent as more of the market is automated to scan articles like this, which could be disastrous for some people. The SEC is investigating the whole deal to see what can be done to prevent slides like this in the future. ALGOS are great to have but could end up biting us even harder. I'm not sure I have an opinion yet on where we need to take this, but I may end up re-posting with more later. Something to chew on for the day...

Wednesday, July 23, 2008

WAA Championship - winners without scorecards!

I just spent some time reading a good post from Avinash about the WAA Championship (http://www.webanalyticsassociation.org/wachampionship/), and the results are in. Sadly my company was not in the top 4, but this gives us a great opportunity to explore new things and see our work in a different light.

Avinash brings up some good points about what I know we're all striving for in our analysis: making it better and more actionable without flooding our clients with data. He also gives 7 tips on how to improve the quality of analysis that I know we all think about but may struggle to put into action.

Here is a link to the article, I highly suggest reading it as well as the winning entries: http://www.kaushik.net/avinash/2008/07/consultants-analysts-present-impactful-analysis-insightful-reports.html

Notice a common theme among all of them… no scorecards! This might be a good approach if we're looking to break out of the doldrums of scorecard data smog and really drive our clients' business forward. I often ask myself whether we should ditch Excel entirely for client-facing documents. The more we allow Excel to control our work, the more data-laden things get. It's the whole goldfish-in-a-bowl thing… if we can start to change our own culture first, we can change our clients and push them forward!


Let me know what you think, I’m eager to hear your feedback.

Tuesday, February 12, 2008

Leaky Pages

I was asked recently by a client to do a deep-dive on their Homepage click-through rates. As I was pulling the information I noticed that only about 20% of all site visits enter on the Homepage, and that only 25% of all visits ever see the Homepage. This was alarming because I know they were very concerned with optimizing and making changes to the Homepage: they did a refresh a few months ago and are finally grasping the impact (and power) that optimization, supported by sound analytics, can have on click-through rates.


The project bubbled over into an Entry Page Deep-Dive, where I discovered that 68% of the traffic entered on just 20 pages. For these pages I then pulled in their total visits, exits, and single-page visits. Based on these 4 data points, and using some simple standard deviations and medians, I was able to pinpoint which of these 20 pages were the best candidates for improvement. I ended up calling these 'Leaky Pages' because they were the ones getting the most eyeballs first (thus making the most 'first impressions') but, at the same time, losing a lot of that traffic via exits or bounces. A lot of these pages were downloads that people are searching on, and of course the goal of SEO is to get people exactly what they are looking for, so there is a win there. At the same time, however, these pages weren't a big enough draw to keep the visitors: a good chunk of traffic was coming to the site for a specific download, getting it, then leaving.


Paring the 20 pages down to just the critical few gave the client something to work on immediately, rather than making sweeping changes to the entire site. If they make a few optimizations on these pages, they can surely drive people deeper into the site more often and keep them on the site longer. It also took the attention away from making immediate optimization changes to the Homepage because, if I've done my job correctly, they will see there is less value in optimizing for an extra 1,000 clicks than in stopping the loss of 100k visits a month to a Thank You page that isn't engaging.


Check these two charts out: the first identifies the top 20 entry pages, and the second identifies which pages have a bounce rate more than 1 standard deviation above the median page bounce rate, and which have a click-through rate more than 1 standard deviation below the median click-through rate. It's a pretty simple way to see exactly where to focus your efforts.
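If you want to reproduce the second chart with something more repeatable than eyeballing, here's a minimal sketch of that filter in Python. The column names and sample figures are placeholders I made up; the cutoffs follow the rule above (bounce rate more than one standard deviation above the median, click-through rate more than one standard deviation below it).

```python
import pandas as pd

# Minimal sketch of the leaky-page filter described above. The column names
# and sample figures are placeholders; swap in your own top-20 entry-page export.
pages = pd.DataFrame({
    "entry_page":   ["/", "/downloads/guide", "/products", "/blog/post-1", "/contact"],
    "entry_visits": [52000, 41000, 18000, 9000, 7000],
    "bounce_rate":  [0.35, 0.72, 0.41, 0.58, 0.30],
    "click_rate":   [0.44, 0.12, 0.38, 0.22, 0.50],
})

bounce_cutoff = pages["bounce_rate"].median() + pages["bounce_rate"].std()
click_cutoff = pages["click_rate"].median() - pages["click_rate"].std()

pages["high_bounce"] = pages["bounce_rate"] > bounce_cutoff  # leaking via bounces
pages["low_click"] = pages["click_rate"] < click_cutoff      # leaking via weak click-through

leaky = pages[pages["high_bounce"] | pages["low_click"]]
print(leaky[["entry_page", "entry_visits", "bounce_rate", "click_rate"]])
```

Whether you use the sample or population standard deviation barely matters for 20 pages; the point is a defensible cutoff instead of a gut call.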


Friday, January 18, 2008

Page load time and page references

Issues with your site or page loading? High bounce rates? Think you need better content? Maybe it's the page itself.

This is a helpful tool when troubleshooting pages that load slowly. You can see if images are too big, if certain CSS or script files are too big, and so on. You can also see if a page is referencing images or scripts that are not served over HTTPS, which causes that annoying mixed-content security alert.

And it's free!: http://www.websiteoptimization.com/services/analyze/
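If you'd rather poke under the hood yourself, here's a rough DIY sketch along the same lines (my own, not affiliated with that tool): it lists a page's images, scripts, and linked files, flags anything not served over HTTPS, and calls out large files when the server reports a size. The URL and the "too big" threshold are placeholders, and it leans on the requests and BeautifulSoup libraries.

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

# Rough DIY page audit: list referenced images, scripts, and linked files,
# flag anything not served over HTTPS, and call out large files. The size
# threshold is an arbitrary placeholder.
SIZE_LIMIT = 100000  # bytes (~100 KB), purely illustrative


def audit_page(page_url):
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    assets = []
    for tag, attr in [("img", "src"), ("script", "src"), ("link", "href")]:
        for node in soup.find_all(tag):
            ref = node.get(attr)
            if ref:
                assets.append(urljoin(page_url, ref))  # resolve relative paths

    for asset in assets:
        if not asset.startswith("https://"):
            print("NOT HTTPS:", asset)
        try:
            head = requests.head(asset, allow_redirects=True, timeout=10)
            size = int(head.headers.get("Content-Length", 0))
            if size > SIZE_LIMIT:
                print("LARGE ({} bytes): {}".format(size, asset))
        except requests.RequestException:
            print("COULD NOT CHECK:", asset)


if __name__ == "__main__":
    audit_page("https://example.com/")  # placeholder URL
```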

I've tried it a few times on some of my sites and have been able to fold it into some good analysis that points to checking under the hood, rather than bringing in a copywriter. Check it out, see what you think...