Extreme database maintenance this weekend

We typically only do maintenance once every couple of months and only to one or two servers at a time, but this weekend will be pretty ridiculous.

The database servers listed below will halt traffic processing for up to 10 hours this weekend while we perform some necessary maintenance, starting either Friday night (if I get it ready in time) or Saturday morning (US Pacific time, GMT -7). When the maintenance is done, they will all have a large backlog of traffic to process before they are caught up with real time again.

(Spy will still be working during this time, because it is an actual live stream of data as it comes into our tracking servers, rather than something pulled from the database. So if you have a paid account, you can still keep up with basic stats via Spy during the maintenance.)

These are mostly older servers, so most sites registered within the last year or so should be unaffected. The exceptions are db27 and db33, which are less than a year old; they hold quite a bit less data than the other servers, though, so their maintenance windows should be much shorter.

To determine which database server a site is hosted on, take a look at the preferences page for that site.

Affected database servers:
1 - 11, 15, 17, 18, 20, 21, 23, 27, 33
10 comments |   Jul 15 2011 10:50am

No more Flash!

We have converted to a native graphing library (Highcharts) that works on almost every platform, including iPhone and iPad. It loads faster and it looks good. The graphs look and feel the same as the old ones, so other than the fact that they will now work on your non-Flash devices, you can barely tell the difference:




Bonus: this will finally allow us to send PDF email reports. In fact, those have been under development for about a week now, alongside this. They're almost done... we were planning to release them at the same time but a few last minute issues dashed those hopes. Tomorrow, perhaps.

The only bad thing about Highcharts is that it doesn't (yet) work on Android, because Android doesn't support SVG. The next version of Android will supposedly support SVG, but in the meantime the Highcharts developers are working on an update to make the library compatible with this OS, as can be seen on GitHub. For now, Android users will see the old school bar graphs instead. (I'm an Android user, so yes, I take this problem quite seriously.)
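As a rough illustration only (the actual check may differ, and the function name is made up), the fallback could be as simple as a server-side user agent check before deciding which graph to render:

    <?php
    // Illustrative sketch only: one way to decide between Highcharts (SVG)
    // and the old-school bar graphs. Android lacks SVG, so it gets the fallback.
    function supports_highcharts($user_agent) {
        return stripos($user_agent, 'Android') === false;
    }

    if (supports_highcharts($_SERVER['HTTP_USER_AGENT'])) {
        // render the Highcharts (SVG) version of the graph
    } else {
        // render the legacy bar graph
    }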

OK, so we do have one last piece of Flash left on our site: the maps you see in Locale -> Global map. If anyone knows of a similar non-Flash mapping library, do share.
54 comments |   Jul 13 2011 9:04am

On-demand email reports (PDFs coming soon)

In your email report area, there is a new "send now" option next to each of your email reports. This lets you send a report immediately, instead of waiting for the next day/week/month. You can choose a custom date range for the report and enter different email addresses than the default. These requests are inserted into a queue that is processed once per minute, so you should receive the report within a few minutes of requesting it.
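For the curious, here is a rough sketch of how such a queue might work, assuming MySQL via PDO and a cron job that runs every minute. The table and column names and the send_email_report() helper are made up for illustration; the only details taken from above are that requests are queued and the queue is drained about once per minute.

    <?php
    // Hypothetical sketch of a "send now" queue. Names are illustrative only.

    // 1. When a user clicks "send now", enqueue the request.
    function enqueue_report($pdo, $site_id, $date_from, $date_to, $recipients) {
        $stmt = $pdo->prepare(
            'INSERT INTO report_queue (site_id, date_from, date_to, recipients, queued_at)
             VALUES (?, ?, ?, ?, NOW())'
        );
        $stmt->execute(array($site_id, $date_from, $date_to, implode(',', $recipients)));
    }

    // 2. A cron job running once per minute drains the queue.
    function process_queue($pdo) {
        $rows = $pdo->query('SELECT * FROM report_queue ORDER BY queued_at')->fetchAll();
        foreach ($rows as $row) {
            send_email_report($row); // hypothetical helper that builds and mails the report
            $pdo->prepare('DELETE FROM report_queue WHERE id = ?')->execute(array($row['id']));
        }
    }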

We are also investigating native graphing libraries, such as Highcharts, to replace our Flash-based graphing system. Once that is in place, we will be adding PDF as an attachment option to our email reports; these will essentially be screenshots of how your reports look on the web site (minus the sidebar). This of course also means that the graphs will work on all modern mobile devices. Really, that's the main motivation to move to such a system, but having them in the PDF reports will be a nice bonus.
8 comments |   Jun 22 2011 5:40pm

Big screen mode + faster page loads

We're testing a new feature we're calling "Big screen", which is a self-updating, single page report highlighting your key metrics, designed for that giant plasma hanging on your office wall. It looks radically different from the rest of Clicky, because we felt a dark theme made more sense for this type of feature.

It grows and shrinks with your browser window, up to 1080p. Our developer also made sure it looks awesome on the iPad and iPhone - yes, including the graphs.

For now, you can find it in the sidebar menu when viewing your reports, although I'm not sure that's the best place for it. I'm thinking it would make more sense to launch it from the user homepage. So it may move around on you. Really, we just wanted your feedback on the feature itself before we finalized anything, and we figured more of you would see it if it was in your site's sidebar menu. Let us know what you think.

It is available to all users right now but will become a Pro-only feature by the end of the week.




Faster page loads

Our aging web server has been replaced by a new one that I guessed beforehand would be about 100x as powerful, and it turns out I was almost exactly right. The 15 minute load average on our old server was typically between 5.0 and 7.0 during peak times; the new server sits between 0.05 and 0.10 during the same hours. It's a beast.

The result is much faster loading times across the board. The only things that are still very database-heavy are filtering visitors and large date range reports. Those will still load faster, but since the total loading time may be a bit long either way, the improvement won't be as noticeable as it is for the rest of the site.
52 comments |   Jun 20 2011 12:45pm

Better bot protection and backups

Two complaints we receive fairly often are that too many bots get logged, and that backups on Friday night are annoying. Well, here I am on a Friday night letting you know that things are looking up!

Bots

Let me first be clear that this problem is not unique to Clicky. Most bots don't interact with Javascript, so most are not logged by Javascript-based trackers. We also have a fairly big regular expression that aims to filter out any that do execute Javascript, and it works pretty well. I think we are definitely one of the best at filtering out bots already, but the complaints keep coming in. People see it as a defect of Clicky, even though it affects every tracker. And the bots keep getting trickier.

Both Microsoft and Google have started sending out bots in disguise in the last year or two, the theory being that they're ensuring your content doesn't appear differently when your web site thinks it's a regular visitor instead of a crawler. These bots have "real" user agents so you can't tell they're bots. However, a few people pointed out something unique: their user agent is always Windows XP / MSIE 6.0, and they always report a screen resolution of 1024x768. That alone is not enough to filter out a visitor - chances are good someone on IE6 has a real dinosaur of a computer on their hands - but since Clicky tracks organizations, we can dig deeper. When we look up the organization info for these visitors, if it's Google or Microsoft, we can be 99.9% confident it's a bot. (Because if either of these extremely rich companies still seriously has employees using computers this horrible... well, they should be sued.)

The problem, however, was that we didn't look up the organization of a visitor until after that visitor was inserted into the database. But tonight, I re-arranged some things, and now we check for those three unique factors - XP, IE6, 1024x768 - before inserting into the database. If we have a match, we look up the organization immediately and pull a little preg_match("#(microsoft|google)#i", $organization) magic out of our hats, and if it returns true - BAM. Not logged.
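In other words, something roughly like this (a simplified sketch - the lookup_organization() helper and the exact user agent checks are illustrative, but the three factors and the regular expression are the ones described above):

    <?php
    // Simplified sketch of the pre-insert bot check described above.
    // lookup_organization() is a hypothetical helper; the real code differs.
    function is_disguised_bot($user_agent, $resolution, $ip) {
        // Only suspicious if ALL three factors match:
        // Windows XP ("Windows NT 5.1" in the UA), MSIE 6.0, and 1024x768.
        $suspicious = strpos($user_agent, 'Windows NT 5.1') !== false
            && strpos($user_agent, 'MSIE 6.0') !== false
            && $resolution === '1024x768';

        if (!$suspicious) {
            return false; // normal visitor, log as usual
        }

        // Only now pay the cost of an immediate organization lookup.
        $organization = lookup_organization($ip);

        // If the org is Google or Microsoft - BAM. Not logged.
        return (bool) preg_match("#(microsoft|google)#i", $organization);
    }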

There will still be bots that sneak through, I'm sure of that. However, Google and Microsoft seem to be the biggest "problems", and I've never seen an obvious bot from either of them that did not have XP / IE6 / 1024. They might change that in the future to make our lives more difficult again, but for now I'm confident this will eliminate almost all of the bots that we log that shouldn't be getting logged. Yay!

Backups

We do full database backups every Friday night starting at 10pm Pacific time (GMT -7), during which traffic processing is halted. As the databases grow in size, these backups take longer and longer. We were investigating improvements to this process earlier this week, and I realized I was not setting a flag that would basically cut the time needed to do the backup in half. This was a horrible oversight on my part, but I'll own up to it, and it has now been fixed. Most databases complete their backup in 1-2 hours, but some that are 3-4 years old were getting near the 4 hour mark. Now, the most any of them should take is about 2 hours, and most should be an hour or less.

But wait, there's more! We're going to be moving to a new database engine in the near future (goal: 3 months) that will be much more backup friendly. We won't have to halt processing at all while the backups are taking place. That will be a nice change. ^_^

Good night!
19 comments |   Jun 10 2011 8:19pm
