Monday, October 4, 2010
Interesting article on why Mint won out over Wesabe, a company that I, as a Mint user of 3+ years, had never heard of, but I found the digital rise and fall interesting nonetheless.
A few things I took away from this:
• Leverage what people are already doing, and don’t reinvent the wheel
• Reduce the interactions the user has to take, rather than biasing toward teaching the user
• Accuracy only has to be ‘okay’ for the general public to like it
• User Experience matters
http://money.cnn.com/2010/10/04/technology/wesabe_vs_mint/index.htm
Wednesday, May 19, 2010
Live Mesh - the great promise of the cloud, realized
For anyone who hasn't checked out Live Mesh yet, I highly recommend it. After a simple setup process, you can add folders directly on your local PC or Mac that look and feel *just* like the folders on your computer. You can then add files, photos, music, whatever, and sync them to the cloud. I've used Windows Live SkyDrive, and although it offers more space, I really think Live Mesh is the way to go: it just 'acts' like your computer and you don't have to learn a new website.
What's even better, you can install the Live Mesh app on your Windows Phone, add folders on the phone, and have them sync to your PC folders over the air; it also works with Mac. Super.
Thursday, April 29, 2010
HP + Palm: it's a tablet play
I read this article recently that mentioned how the HP acquisition of Palm spells trouble for Microsoft & Windows Phone.
I have to disagree. The article focuses on Kin, which is basically a rev’d up Sidekick with no app store. Danger was purchased late in the game and MS’s exec team couldn’t pull Kin & WP7 together in time, once MS decided in early Jan 2008 to switch directions on WP7 and go with the Dorado client. Ballmer has even said they will merge over the next couple years when they get their shit together ;).
Windows Phone is also making a larger play in the app world, and I think more developers will be attracted to it once they have a familiar toolset and see how good the OS is. The Xbox/XNA gaming piece also gives it an edge over the competition, including Palm, whose app store is still fledgling right now.
Overall I think HP is making a tablet play here, rather than a phone play. They’ve got to compete with Dell, which is repositioning itself in this market by adopting Android as its tablet OS.
The iPhone OS scaled very well to the iPad, and I can see HP doing the same with webOS. The HP Slate is a good device although bulky, and Windows 7 doesn’t scale well on touch devices yet. Microsoft is already working on comparable OSes, and I think once they nail Windows 7 on tablets they’ll be a leader in the space (take Courier, for example).
HP will still continue to produce PC’s with Windows 7 on them, but the future is tablets.
Friday, September 12, 2008
Think your browsing activity doesn't matter? Try $150MM
Monday was a dismal day for United Airlines (UAL) as its stock slid 75% into the depths of despair. The sky was falling as news broke that the company was going bankrupt (again!)...
It's not surprising, given oil prices, our economy, and this looming (will it ever really be one?) recession. Too bad it was a mistake.
Wait, wait, what was that? No really, it was all just a mistake. Google's crawler picked up a news story from a Tribune news site and posted it to its headlines. The story spread through the internet, then spilled over to Wall Street, where the stock took a dive. The Wall Street Journal lays out the events in this article, and it's worth a read: http://online.wsj.com/article/SB122109238502221651.html
NASDAQ eventually halted trading and the slide was stopped. Unfortunately, the stock is still off about 9% from where it was the Friday before. A single click on an old article cost the company (and its shareholders) nearly $150MM.
How can this happen? Much of the market is now driven by automated trading and things called ALGOS (algorithms), which constantly scan the web, news, and other information and make automated decisions to buy or sell based on market news, presumably so you can get an edge on the other guy. In a tumble like this, however, with such a big company, it was a cavalcade of failure: these ALGOS started to sell, then more sold because the other ALGOS sold, and so on. A good explanation of it is here: http://www.reuters.com/article/reutersEdge/idUSN1039166420080910?pageNumber=2&virtualBrandChannel=0
This will become even more prevalent in the future as we automate our markets to scan articles like this, which could be disastrous for some people. The SEC is investigating the whole deal to see what can be done to prevent slides like this in the future. ALGOS are great to have but could end up biting us even more. I'm not sure I have an opinion yet on where we need to take this, but I may end up re-posting with more later. Something to chew on for the day...
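For anyone curious what that feedback loop looks like, here's a toy Python sketch of my own (the prices, thresholds, and algo count are entirely made up, and this is not based on any real trading system): each simulated algo dumps its position once the price falls past its own trigger, and every sale pushes the price down enough to set off the next one.

```python
# Toy illustration of a news-triggered sell cascade (illustrative only).
import random

random.seed(8)

start = 12.30                                  # rough pre-slide share price (approximate)
price = start
# Each algo's sell trigger, expressed as a fraction of the starting price.
algos = sorted(random.uniform(0.80, 0.99) for _ in range(50))

price *= 0.97                                  # the bogus bankruptcy headline causes an initial dip
sold = set()
changed = True
while changed:
    changed = False
    for i, trigger in enumerate(algos):
        if i not in sold and price < start * trigger:
            sold.add(i)                        # this algo dumps its position...
            price *= 0.98                      # ...which pushes the price down further
            changed = True

print(f"{len(sold)} of {len(algos)} algos sold; price is down {1 - price / start:.0%}")
```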
Wednesday, July 23, 2008
WAA Championship - winners without scorecards!
I just spent some time reading a good post from Avinash about the WAA Championship (http://www.webanalyticsassociation.org/wachampionship/), and the results are in. Sadly my company was not in the top 4, but this gives us a great opportunity to explore new things and see this in a different light.
Avinash brings up some good points about what I know we’re all striving for in our analysis: how to make it better and more actionable, without flooding our clients with data. He also gives 7 tips on how to improve the quality of analysis that I know we all think about but may struggle to put into action.
Here is a link to the article, I highly suggest reading it as well as the winning entries: http://www.kaushik.net/avinash/2008/07/consultants-analysts-present-impactful-analysis-insightful-reports.html
Notice a common theme among all of them… no scorecards! This might be a good approach if we’re looking to break out of the doldrums of scorecard data smog and really drive clients’ business forward. I often ask myself whether we should ditch Excel entirely for client-facing documents. The more we allow Excel to control our work, the more data-laden things can get. It’s the whole goldfish-in-a-bowl thing: if we can start to change our own culture first, we can change our clients and push them forward!
Let me know what you think; I’m eager to hear your feedback.
Tuesday, February 12, 2008
Leaky Pages
I was asked recently by a client to do a deep-dive on their homepage click-through rates. As I was pulling the information I noticed that only about 20% of all site visits enter on the homepage, and that only 25% of all visits ever see the homepage. This was alarming because I knew they were very concerned with optimizing and making changes to the homepage; they did a refresh a few months ago and are finally grasping the idea (and power) that optimization, supported by sound analytics, can have on click-through rates.
The project bubbled over into an entry-page deep-dive, where I discovered that 68% of the traffic entered on just 20 pages. For these pages I then pulled their total visits, exits, and single-page visits. Based on these 4 data points, and using some simple standard deviations and medians, I was able to pinpoint which of these 20 pages were the best candidates for improvement. I ended up calling these 'Leaky Pages' because they were the ones getting the most eyeballs first (thus making the most 'first impressions') but, at the same time, also losing a lot of traffic via exits or bounces. A lot of these pages were downloads that people search for, and of course the goal of SEO is to get people exactly what they are looking for, so there is a win there. At the same time, however, these pages weren't a big enough draw to keep the visitors: a good chunk of traffic was coming to the site for a specific download, getting it, then leaving.
By paring the 20 pages down to just the critical few, it gave the client something to work on immediately, rather than making sweeping changes to the entire site. If they make a few optimizations on these pages, they can surely drive people deeper into the site more often and keep them on the site longer. It also took the attention away from making immediate optimization changes to the homepage because, if I've done my job correctly, they will see there is less value in optimizing for an increase of 1,000 clicks vs. losing 100k visits a month due to a thank-you page that isn't engaging.
Check these two charts out: the first identifies the top 20 entry pages, and the second identifies which pages have a bounce rate more than 1 standard deviation above the median page bounce rate, and which have a click-through rate more than 1 standard deviation below the median click-through rate. It's a pretty simple way to see exactly where to focus your efforts.
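If you want to reproduce the filter behind that second chart, here's a rough Python sketch of the approach. The field names, sample numbers, and the click-through definition (share of visits that continue past the page) are my own assumptions for illustration, not the client's actual data.

```python
# Flag "leaky" entry pages: bounce rate more than one standard deviation above
# the median, or click-through rate more than one standard deviation below it.
from statistics import median, stdev

pages = [
    # page, entry visits, total visits, exits, single-page visits (made-up numbers)
    {"page": "/downloads/tool-a",   "entries": 42000, "visits": 45000, "exits": 38000, "bounces": 31000},
    {"page": "/products/overview",  "entries": 18000, "visits": 30000, "exits": 12000, "bounces": 6000},
    {"page": "/support/faq",        "entries": 9000,  "visits": 15000, "exits": 8000,  "bounces": 5200},
]

for p in pages:
    p["bounce_rate"] = p["bounces"] / p["entries"]   # entered and left without a second page view
    p["ctr"] = 1 - (p["exits"] / p["visits"])        # share of visits that click through to another page

bounce_rates = [p["bounce_rate"] for p in pages]
ctrs = [p["ctr"] for p in pages]
bounce_cutoff = median(bounce_rates) + stdev(bounce_rates)
ctr_cutoff = median(ctrs) - stdev(ctrs)

leaky = [p for p in pages if p["bounce_rate"] > bounce_cutoff or p["ctr"] < ctr_cutoff]
for p in leaky:
    print(f"{p['page']}: bounce {p['bounce_rate']:.0%}, click-through {p['ctr']:.0%}")
```

Run against the full top-20 list, this spits out the handful of pages worth optimizing first, which is essentially what the second chart shows.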
Friday, January 18, 2008
Page load time and page references
Issues with your site or pages loading? High bounce rates? Think you need better content? Maybe it's the page itself.
This is a helpful tool when troubleshooting pages that load slowly. You can see if images are too big, or if certain CSS or script files are too big, etc. You can also see if a page references images or scripts that are not served over HTTPS – which causes that annoying security alert.
And it's free!: http://www.websiteoptimization.com/services/analyze/
I've tried it a few times on some of my sites and have been able to cook it into some good analysis, which leads to checking under the hood rather than bringing in a copywriter. Check it out, see what you think...
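If you'd rather poke at this yourself, here's a minimal do-it-yourself sketch along the same lines, assuming the requests and BeautifulSoup libraries: it lists the resources a page references, reports their sizes, and flags anything not served over HTTPS. It's a rough approximation of the idea, not how the linked tool actually works.

```python
# Rough page-weight and mixed-content check: list referenced images, scripts,
# and link tags, report their sizes, and flag non-HTTPS references.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

def audit_page(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    refs = [tag.get("src") or tag.get("href")
            for tag in soup.find_all(["img", "script", "link"])]
    for ref in filter(None, refs):
        full = urljoin(url, ref)
        if full.startswith("http://"):
            print(f"not HTTPS: {full}")
        try:
            head = requests.head(full, allow_redirects=True, timeout=10)
            size = int(head.headers.get("Content-Length", 0))
            print(f"{size / 1024:8.1f} KB  {full}")
        except requests.RequestException:
            print(f"  failed   {full}")

audit_page("https://www.example.com/")
```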