


Dealing with [secure search]

The vast, vast majority of my Google search traffic is now "[secure search]". Therefore, I have to give up on the old way of looking at things - which keyphrases Google is sending traffic to - and focus on which PAGES they're sending it to. But something is not adding up, unless I'm missing something. Example:

My top performing page used to get 20,000 visits for "key phrase". Now it only gets 2,000, but I have 80,000 "[secure search]", many of which probably used that keyphrase to get to that page. So I filter to see how many people came to that page via Google by using the "domain" filter and selecting "Google" and I get something like 2170.

That can't be right. I'm still getting nearly as many Google searches per day as I ever did, so it's really not likely my top page has dropped that much and the others are making it up behind "secure search."

How can I determine precisely how many people Google sent to a particular page? Also, same thing with Facebook - you can't generally see what FB status or page linked to you, but I would like to be sure that when Clicky tells me how MANY people it sent to a specific page, I know it's accurate.

Am I missing something, or should this become a bug report?

Posted Tue Sep 24 2013 2:51p by merono***


Have you selected Content -> Entrance? Those are the pages where people coming from Google land.

Besides that, I agree that it is important for the future to find a way to deal with the [secure search] thing and get as much information as possible. As you already said, the entrance pages of our sites are becoming an important factor.

If you look here

http://moz.com/blog/decoding-googles-referral-string-or-how-i-survived-secure-search

you can see how you can get at least SOME information out of the Google referrer string.
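
To make that concrete, here is a rough sketch (Python, just an illustration of what the article describes, not anything Clicky does internally) of what is still recoverable from a secure-search referrer: the Google domain, whether it was a search at all, and sometimes the cd= (rank) parameter, even though the keyword itself is gone.

# A minimal sketch of pulling what is still recoverable from a secure-search
# Google referrer. Parameter availability varies by browser and by Google
# property, so treat missing values as "unknown".

from urllib.parse import urlparse, parse_qs

def decode_google_referrer(referrer):
    parsed = urlparse(referrer)
    params = parse_qs(parsed.query)

    q = params.get("q", [None])[0]             # keyword: usually empty or absent now
    cd = params.get("cd", [None])[0]           # ranking position, when Google includes it
    source = params.get("source", [None])[0]   # e.g. "web", "images"

    return {
        "google_domain": parsed.netloc,                   # e.g. www.google.com, google.de
        "is_search": parsed.path in ("/url", "/search"),
        "keyword": q if q else "[secure search]",
        "rank": int(cd) if cd and cd.isdigit() else None,
        "vertical": source,
    }

# Example referrer with the keyword stripped but the rank (cd) still present:
ref = "https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&url=http%3A%2F%2Fexample.com%2Fpage"
print(decode_google_referrer(ref))
# {'google_domain': 'www.google.com', 'is_search': True,
#  'keyword': '[secure search]', 'rank': 1, 'vertical': 'web'}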

If you combine page entrance data with search ranking data, you may get further information.
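
To show what I mean by combining the two, here is a back-of-the-envelope sketch: take a page's [secure search] entrance count and split it across the keywords you know the page ranks for, weighted by an assumed click-through rate per position. The CTR table and the numbers below are made-up assumptions for illustration, not Clicky data.

# Rough, illustrative split of a page's [secure search] entrances across
# the keywords it is known to rank for, weighted by assumed CTR per position.

ASSUMED_CTR_BY_POSITION = {1: 0.32, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06}

def estimate_keyword_split(secure_entrances, rankings):
    """rankings: list of (keyword, position) pairs for one landing page."""
    weights = {kw: ASSUMED_CTR_BY_POSITION.get(pos, 0.03) for kw, pos in rankings}
    total = sum(weights.values())
    return {kw: round(secure_entrances * w / total) for kw, w in weights.items()}

# Example: 80,000 secure-search entrances to a page that ranks #1 and #4
print(estimate_keyword_split(80000, [("key phrase", 1), ("other phrase", 4)]))
# {'key phrase': 64000, 'other phrase': 16000}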

I think it is important your us (and for clicky!) to find creative ways to get the most out of the situtaion as it is. I think it will be a key feature of any web analytics service in the future to "describe" or "guess" the Google traffic for the site owners in the best way possible.

Posted Wed Sep 25 2013 12:33a by pold***


Thanks, poldi, that article is really useful. I checked Content > Entrance > [my top page] > Filter > Domain > Google and got the same unlikely low number as in my earlier example. In August, I got 27,000 page views, but it's telling me less than 2000 came from Google. And if I filter for Traffic Source > Search, it tells me 65,000 searches landed on the page. Something is not adding up.

In the AdWords Keyword Tool, the search volume for my top keyphrase is around X, but Clicky is telling me that even though I'm #1 for that phrase, I'm only getting around 1/10th of X visits from Google on that page. Now, I should mention others are seeing a 90% difference between what the Keyword Tool says the term gets and what their #1 page for it is getting. It could be other analytics tools are counting things the same way as Clicky (which might not be "wrong" but rather looking at the data differently than I expect, although I can't guess how). But it could be Google's doing something very strange - they've done a lot of that in the last couple of years.

If Clicky can be made to decode anything further from the referral string, that would be awesome. For now, I would settle for a simple way to count precisely how many people landed on a particular page via Google (or Facebook, or Bing, etc.). If you use "traffic source > search" it's not broken down by engine, and if you use "domain > google.com" or "facebook.com" or whatever, the numbers seem really low. I suspect Clicky has a reason for doing it this way, but I don't know what it is and it's not helpful. I just want to know how many visitors landed on a page from Google.com. And I could be wrong, but I don't think any of the filters are actually designed to do that.
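
To be clear about the count I'm after, here is a sketch against a hypothetical export of visit records (landing page plus referrer). The field names and the "visits" list are made up for illustration; this is not the Clicky API.

# Count visits that LANDED on a given page with a referrer from a given
# search engine or site. The records and field names below are made-up
# stand-ins for an exported visitor log, not Clicky's actual data format.

from urllib.parse import urlparse

def count_landings(visits, landing_page, referrer_domains):
    count = 0
    for visit in visits:
        if visit["landing_page"] != landing_page:
            continue
        host = urlparse(visit.get("referrer", "")).netloc.lower()
        # match the domain itself or any of its subdomains
        # (www.google.com, m.facebook.com, ...)
        if any(host == d or host.endswith("." + d) for d in referrer_domains):
            count += 1
    return count

visits = [
    {"landing_page": "/top-page",   "referrer": "https://www.google.com/url?q="},
    {"landing_page": "/top-page",   "referrer": "https://m.facebook.com/"},
    {"landing_page": "/other-page", "referrer": "https://www.google.com/"},
]
print(count_landings(visits, "/top-page", ["google.com"]))    # 1
print(count_landings(visits, "/top-page", ["facebook.com"]))  # 1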

Posted Wed Sep 25 2013 7:35a by merono***


I thought this was an interesting article on that very subject.
http://www.wordtracker.com/blog/hey-google-why-did-you-turn-the-lights-out

Posted Fri Sep 27 2013 8:31a by karenra***


Thanks, karenratte, VERY interesting.

I've done some research about how Analytics tracks this, and I'm thinking that when you filter "domain > google.com", Clicky may be tracking referrals from Google products - the forums, the Google+ pages, etc. That would explain the much-lower-than-expected numbers.

If so, there's just no way to parse the data to see how many visitors you're getting from Google to a particular page.
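
If that theory is right, the useful distinction would be between true web-search referrers and referrers from other Google properties. Something along these lines, where the hostname lists and rules are my own guesses, not how Clicky actually classifies referrers:

# Rough sketch of separating Google WEB SEARCH referrers from referrers that
# merely come from other Google properties. Hostname patterns are guesses.

from urllib.parse import urlparse

NON_SEARCH_GOOGLE_HOSTS = {"plus.google.com", "groups.google.com",
                           "productforums.google.com", "mail.google.com"}

def classify_google_referrer(referrer):
    parsed = urlparse(referrer)
    host = parsed.netloc.lower()
    if not (host.startswith("google.") or ".google." in host):
        return "not google"
    if host in NON_SEARCH_GOOGLE_HOSTS:
        return "google property (not search)"
    if parsed.path in ("", "/", "/url", "/search"):
        return "google web search"
    return "google (other)"

print(classify_google_referrer("https://www.google.com/url?q="))      # google web search
print(classify_google_referrer("https://plus.google.com/+SomePage"))  # google property (not search)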

Posted Fri Sep 27 2013 8:41a by merono***


As has been pointed out elsewhere, you can access search information via Google Webmaster Tools that at this time simply isn't available to Clicky, though the results are only aggregates (i.e. the top search terms and how many people used them). Clearly Google is keeping this information available to itself under the hood, and Clicky can't generate it on its own.

This is a bit of a problem, and one that it's easy to think is a below-the-belt way of causing problems for competing companies. Is there any way Clicky can get and use these aggregates, as the traditional approach is going to be more or less redundant now that Google has officially acknowledged that it's rolling out privatised searching to everybody?

http://blog.hubspot.com/marketing/google-encrypting-all-searches-nj

Posted Fri Nov 1 2013 6:22a by ryanwilli***

