Heatmap updates: Longer storage, better scaling, and new filters

We just pushed some updates to heatmaps today.

First, we've extended heatmap storage to 6 months, up from 3. When we released heatmaps about 15 months ago, we were worried they would take up too much space if we kept them around too long, so we limited storage to 3 months of data. It took a while before enough people were using them for us to reliably measure the impact on storage, but the good news is that they take up less space than we anticipated, thanks to some good design decisions early on. Of course, data older than 3 months has already been purged, so you won't see a full 6 months right away. But 3 months from now, you will.

Second, we've added a few new filter options when using heatmaps via the on-site analytics widget. The new filters are under the "More..." menu, as shown below. They let you view a page's heatmap for just new visitors, returning visitors, visitors online now, or "registered" visitors, meaning visitors with a "username" custom data field attached to them.
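
For example, if you want your logged-in users to show up under the "registered" filter, you would attach a "username" key to their sessions with clicky_custom.visitor, declared before the tracking code runs. A minimal sketch (the value here is a placeholder):

    <script type="text/javascript">
    var clicky_custom = clicky_custom || {};
    // Attach a username so this visitor counts as "registered" in heatmap filters.
    clicky_custom.visitor = {
        username: 'jane_doe'  // placeholder; use the actual logged-in user's name
    };
    </script>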

We'll probably add a few other filters as we think of them. What would you like to see here? One thing we considered was some kind of engagement filter, e.g. only visitors who bounced or only visitors who did NOT bounce. However, if you think about it, almost everyone who bounces would not have clicked anything on your site, so it wouldn't really tell you much.
[Screenshot: the new heatmap filters under the "More..." menu]
Last, we changed how heatmaps are scaled. The problem was that on pages with lots of clicks, there would generally be a few extremely "hot" areas, e.g. in the navigation bar, and they would be so much hotter than the rest of the page that the other clicks were not even visible. We've changed it so the hotness scale now runs from 1 to 10, rather than from 1 to whatever the raw click count in the database happened to be.

Here is an example of a heatmap for our homepage using the old method, for "all clicks":
[Screenshot: homepage heatmap with the old scaling, where only the login links are visible]
Most visitors coming to our site are already customers, so the two login links are going to be the most clicked items, by about a thousand miles. Of course, there are plenty of other clicks on this page (e.g. new people signing up are going to click the big ass "register" button), but we can't see them unless we apply some filters first, such as only showing visitors who completed our "new user" goal. It would be really nice to be able to see ALL the clicks on this page though, wouldn't it?

So here's what the new scaling method does to the exact same pool of data from the last screenshot, most of which was previously invisible:
[Screenshot: the same homepage heatmap with the new 1 to 10 scaling, with every click visible]
It's certainly noisier, but the hot areas still stand out, and now you can see every single click on the page, because the range from min to max is so much smaller.
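
Conceptually, it's as if we bucket each spot's raw click count into a fixed 1 to 10 range, so even a single click gets a visible color. A rough illustrative sketch (the linear bucketing here is for the sake of example, not the exact formula we use):

    // Map a raw click count onto a fixed 1-10 hotness scale.
    function hotness(count, minCount, maxCount) {
        if (count <= 0) return 0;                // no clicks, no color
        if (maxCount === minCount) return 10;    // all points equally hot
        var t = (count - minCount) / (maxCount - minCount);  // normalize to 0..1
        return 1 + Math.round(t * 9);            // 1..10, so every click is visible
    }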

Let us know what you think!
Jan 09 2014 3:21pm

Twitter direct messages

Four and a half years ago we released several Twitter features, one of which was the ability to receive alerts via direct message on Twitter. This has been an awesome feature and we have used it ourselves since day one, to be notified of certain goals happening on our site, such as a new paying customer. Earlier this year we released our uptime monitoring feature, and we added Twitter direct message alerts there as well.

So we are sad to say that today, these features have been removed, and it's unlikely they will ever come back.

Sometime in mid-October, Twitter added some pretty strict throttling. We never saw any "news" about this, but within a few days we started getting tweets and emails about Twitter alerts no longer working. A bit of research led us to a page on Twitter's web site which says that all accounts are now limited to sending 250 direct messages per day, whether via the API or not.

We send many thousands of direct messages a day, so this feature is now severely broken: most days we blow through the 250-message limit by 1 AM. Unless Twitter revises this limit to something much higher, the feature is permanently dead.

Many other services have probably been affected by this throttle. It just really bothers me how much Twitter keeps slapping third-party developers in the face, the very people who helped build Twitter into what it is today. One thing I know: we will never add another Twitter feature to Clicky, ever. Our Twitter analytics feature still works, thankfully. For now, anyways...
Nov 11 2013 3:06pm

Sticky data: Custom data, referrers, and campaigns saved in cookies

[Do not panic. You do not need to update your tracking code.]

We've just pushed a major update to our tracking code so that it works a bit more like Google Analytics in one respect: some additional data is now saved in first-party cookies (set with Javascript) for visitors to your site. That data consists of referrers, dynamic (UTM) campaign variables, and custom data set with clicky_custom.visitor (renamed from clicky_custom.session; don't worry, the old name will still work indefinitely).

We're calling this "sticky data" and the point of it is twofold. First, for referrers and dynamic campaigns, it better attributes future visits to how a visitor originally arrived at your site. In other words, if someone finds your site via a link or a search, then on future visits where they come to your site directly, the original referrer (and/or campaign) will be attached to those new sessions. This will be particularly useful for those of you who have set up goal funnels using referrers or campaigns. These cookies are set for 90 days. Google does 180, which we think is a bit too long, so we're doing 90 instead.

(Note: this does not work for "static" (pre-defined) campaigns that you create in Clicky's interface. It only works with dynamic ones created with "utm_campaign" and similar variables.)
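
By "dynamic" we mean a campaign tagged directly in the landing URL, along these lines (the domain and values are placeholders):

    http://example.com/pricing?utm_campaign=spring_sale&utm_source=newsletter&utm_medium=email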

Second, custom visitor data. We've thought for a while now how great it would be if that data stuck across visits. For example, if someone logs in and you attach their username to their session, that's great, every time they log in that is. But what about when they visit your site in the future and don't log in? Well, now that we save this data in a cookie, their username will still get attached to their session, so you'll still know who they are. utm_custom data will also be saved! These cookies are set "indefinitely", more or less.

A lot of you use custom visitor data to attach things that are very session-specific though, such as shopping cart IDs, that kind of thing. With this in mind, there are only 3 specific keys we'll save by default for custom visitor data: "username", "name", and "email". Of course, if you have others you want to save in cookies, you can customize the list with the new visitor_keys_cookie option, which is documented in our knowledge base.
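
Here's a rough sketch of what that might look like (for illustration only; the exact syntax is covered in the knowledge base article, and the "plan" key here is made up):

    var clicky_custom = clicky_custom || {};
    clicky_custom.visitor = {
        username: 'jane_doe',       // saved in a cookie by default
        email: 'jane@example.com',  // saved by default
        plan: 'premium'             // custom key, not saved unless listed below
    };
    // Persist the custom "plan" key across visits, along with the defaults:
    clicky_custom.visitor_keys_cookie = ['username', 'email', 'plan'];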

We think the vast majority of you will like this new sticky data. However, if for some reason you don't, we've created another new clicky_custom option, sticky_data_disable. Setting it stops this data from being saved to or read from cookies, without having to disable cookies entirely. And of course, if you have disabled cookies entirely, this data will never be saved in the first place.
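
If you do want to opt out, it would look something like this (again illustrative; see the documentation for exact usage):

    clicky_custom.sticky_data_disable = true;  // don't save sticky data to cookies or read it back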

Originally we wanted to add support for parsing GA's "__utmz" cookies, which is what GA uses to store campaign and referrer data for 6 months. The cookie format is fairly straightforward, but upon investigating our own GA cookies we saw a lot of inconsistency across the sites it had been set for, so we're going to hold off on that for now.

Our privacy section on cookies has been updated to reflect these changes.

Enjoy!
Sep 27 2013 1:27pm

New features: Site domains, and ignore pages by URL

We just pushed two new features today.

The first is site domains, where we break down the traffic on your site by the domain name each page view was on. This will only be interesting if your site has multiple sub-domains, or if you are tracking more than one root domain under a single site ID, but we know there are a lot of you this will apply to, as it has been requested a number of times.

This new report is under Content -> Domains. Note that we are only logging the domain name for each page view, to give you a general idea of where your traffic is going. You can't filter other reports by this data.

A commonly requested and related feature is breaking down traffic by directory. We will be adding this in the future.

[Screenshot: the new Domains report under Content]
The second feature is the ability to ignore page views based on their URL. This has been requested by many customers!

This can be set up in your site preferences, as explained in a new knowledge base article. You enter one or more patterns to match, and all page views matching those patterns will be ignored. Two side effects to be aware of: 1) if a visitor only views pages that match your patterns, the visitor will not be logged at all; 2) if a visitor lands on one of these filtered pages but then views other pages, some data that is only available on the first page view, such as the referrer, will not be available.

The knowledge base article explains the pattern syntax in full. For example, we support wildcards, but only at the end of a pattern. Read it for full details.
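
To make that concrete, trailing-wildcard matching amounts to something like this (our own sketch of the described behavior, not the actual implementation):

    // Returns true if a URL matches any of the ignore patterns.
    // Wildcards are honored only at the end of a pattern, per the rules above.
    function matchesIgnorePattern(url, patterns) {
        for (var i = 0; i < patterns.length; i++) {
            var p = patterns[i];
            if (p.charAt(p.length - 1) === '*') {
                if (url.indexOf(p.slice(0, -1)) === 0) return true;  // prefix match
            } else if (url === p) {
                return true;  // exact match
            }
        }
        return false;
    }

For example, matchesIgnorePattern('/admin/settings', ['/admin/*']) returns true, so that page view would be ignored.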

We get requests to ignore traffic based on other things too, such as country, referrer, and organization. Some or all of these will likely be added in a future update, and when we do, we'll probably create an entirely new preferences section just for things you don't want to log, instead of piling more options onto the main site preferences page.
Aug 06 2013 7:31pm

The best return on investment we've made

Clicky is almost 7 years old and we've always been a very small team. We've handled all of the email ourselves since the beginning, using Gmail, which we like. I doubt there's a single human being who enjoys doing tech support, but it's critical to customer satisfaction. It's also one of those things that quickly spirals out of control if you're not on top of it every single day, and it takes away precious time from what we really want to do (write code).

At our peak we were getting almost 50 emails a day. Now we're down to about 20. How did we do that? Black magic? No. What we did was build a knowledge base with hundreds of topics (over 300 and counting), including many guides and how-tos, covering tons of the most common problems people have.

Aside from hoping to get fewer emails, we really just wanted thorough guides and articles with screenshots, to make things as easy as possible to understand for new and old customers alike, instead of having to type the same email 10 times a day when someone asks "where's my tracking code?". When a customer reads an article we link them to, they become aware of the knowledge base if they weren't already, and will hopefully turn there first in the future when in need of help.

We released this silently back in March and the effect was immediately noticeable. Email volume dropped by half almost overnight, and has slowly declined a bit more since then as more people become aware of it. It also helps that the contact page is now only available as a link from the knowledge base page. We're not trying to hide our contact info like some companies do; rather, we just want people to see the knowledge base first.

It was a solid two weeks of work coming up with a list of articles to write, categorizing them into a tree, cross-linking them, and writing them all out with screenshots etc. It was an extremely boring and repetitive two weeks, but let me tell you: it was worth every dreadful second.

It's been especially insightful using our heatmap tracking to see what items people are most interested in on each page, and how many people are exploring the knowledge base rather than clicking the link to contact us.

[Screenshot: heatmap of the knowledge base index page]
The only thing missing from it was a search form. That's part of the reason for today's post: to announce that we finally added one. The reason it didn't have one before was that I was going to write a custom search with customized weighting and things like that. But out of the blue yesterday, it was suggested to me that I just use Google's search for it. Not a bad idea, I thought. At the very least it's better than a kick in the pants. It only took about 5 minutes of work, which was great, but also made me sad I didn't think of it before.

Anyways, when you do a search now, it redirects you to google.com with "site:clicky.com inurl:/help/" appended to the end of your query, so the results come only from our help section (which is mainly the knowledge base). Yeah, I could use their embedded search widget, but I don't like it.
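
In other words, the search form does something roughly like this (a sketch; the function name is made up):

    // Redirect a knowledge base search to Google, scoped to our help section.
    function searchHelp(query) {
        var q = query + ' site:clicky.com inurl:/help/';
        window.location.href = 'https://www.google.com/search?q=' + encodeURIComponent(q);
    }
    // e.g. searchHelp('tracking code');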

I was not planning to leave it as a Google search forever, but then I thought about the thousands of man-years of experience that Google has building a search engine, and frankly they're pretty dang good at it. With things like automatic spelling correction, accounting for word variations, and semantic analysis, it works really well. I tested tons of searches (to the point of getting banned because they thought I was a robot) and everything I threw at it worked great, including the order of the search results. I don't like that it takes people off our site, but for now I don't see a better option.

The real point of this post is just to say that building a knowledge base has been the best return on investment we ever made. We easily save 1-2 hours every single day because our email load has dropped so much, and I imagine with the new search functionality, it will drop even more.

Other amazing returns for us have been virtualizing all of our servers, which we wrapped up last year, and automated deployment to all production servers whenever we push new code to our main web server with Git. These save us huge amounts of time, and time is money, my friends. Virtualization was probably the biggest project we've ever done, taking about a full year, so the investment was gigantic. But every time I deploy a new server with a single command, I am one happy panda.

What's the best ROI your company has had?
Jul 30 2013 5:38pm
