Page Tagging (cookies) vs. Log Analysis
There are different ways to collect data about your website visitors. One way is to analyze the log files your web server creates. The other way is often called page tagging: you place JavaScript tags in your website code, and this JavaScript sends the tracking data back to an analytics tool.
Each has its pros and cons, but fortunately you can combine both methods if you’re using Logaholic.
But first, let’s take a look at the differences.
The differences
Your website is hosted on a web server that creates log files, and these accurately record every kind of visitor movement on your site. This ‘Big Data’ is difficult to read, complicated to transport to a central server, and impossible to digest in raw format.
Logaholic can be set up with log files as its main data source to report valuable visitor behavior in an understandable format. It imports visitor data (this process is called “parsing”) into a local database that accurately stores the traffic history of your website’s visitors.
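To make the parsing step concrete, here is a minimal sketch of how a single line in the common Apache/Nginx “combined” log format can be broken into structured fields. The regular expression and field names are illustrative assumptions, not Logaholic’s actual parser.

```typescript
// Minimal sketch: parse one line of the Apache/Nginx "combined" log format.
// Illustrative only; a production parser handles many more edge cases.
interface LogEntry {
  ip: string;
  timestamp: string;
  method: string;
  path: string;
  status: number;
  bytes: number;
  referrer: string;
  userAgent: string;
}

const COMBINED_LOG =
  /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\d+|-) "([^"]*)" "([^"]*)"$/;

function parseLine(line: string): LogEntry | null {
  const m = COMBINED_LOG.exec(line);
  if (!m) return null; // malformed line: skip it
  return {
    ip: m[1],
    timestamp: m[2],
    method: m[3],
    path: m[4],
    status: Number(m[5]),
    bytes: m[6] === "-" ? 0 : Number(m[6]),
    referrer: m[7],
    userAgent: m[8],
  };
}

// Example line as it might appear in access.log:
const entry = parseLine(
  '203.0.113.7 - - [10/Oct/2024:13:55:36 +0200] "GET /pricing HTTP/1.1" 200 5324 "https://www.example.com/" "Mozilla/5.0"'
);
console.log(entry?.path, entry?.status);
```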
Most other analytics programs, such as Google Analytics, use a different data collection method: JavaScript-based tracking (also called page tagging). Through an invisible snippet of JavaScript code in a web page, the tracker tags a visitor with a cookie and sends data back to a central server.
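In essence, a page tag is a small script that sets (or reads) a cookie holding a visitor ID and sends a request to a collection endpoint. The sketch below is a generic illustration; the endpoint URL and cookie name are hypothetical and do not represent the actual Google Analytics or Logaholic tag.

```typescript
// Generic page-tagging sketch (hypothetical endpoint and cookie name).
// Real trackers work on the same principle but collect far more detail.
function getOrCreateVisitorId(): string {
  const match = document.cookie.match(/(?:^|; )visitor_id=([^;]+)/);
  if (match) return match[1];
  const id = Math.random().toString(36).slice(2); // "tag" the visitor
  document.cookie = `visitor_id=${id}; path=/; max-age=${60 * 60 * 24 * 365}`;
  return id;
}

function sendPageView(): void {
  const payload = {
    visitorId: getOrCreateVisitorId(),
    page: location.pathname,
    referrer: document.referrer,
  };
  // sendBeacon survives page unloads better than a normal request
  navigator.sendBeacon("https://collect.example.com/hit", JSON.stringify(payload));
}

sendPageView();
```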
Which method is best for you is hard to determine without knowing your situation. Both methods have their merits, and although they will show significantly different traffic numbers, there is no good or bad method; in fact, they complement each other.
You might already know that a large part of your website traffic comes from non-human visitors (bots). This could be search engines crawling your site, bots with darker intentions, or perhaps a malfunctioning script on your own site.
It’s good to monitor all traffic: pages should be sufficiently crawled by search engines, and from a security point of view it’s good to know whether your site is being compromised. JavaScript-based tracking relies on the client; when the client does not support JavaScript, this method fails to report and that traffic is invisible. Web server log files record both human and non-human traffic.
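Because the log contains everything, separating human from non-human traffic can start with a simple look at the User-Agent field. The sketch below builds on the LogEntry type from the parsing example above; the patterns are a rough heuristic, not a complete bot-detection method.

```typescript
// Rough heuristic: well-behaved crawlers identify themselves in the User-Agent.
// Malicious or malfunctioning clients often do not, so real bot detection
// needs more signals (IP ranges, request rate, robots.txt hits, ...).
const BOT_PATTERNS = /bot|crawler|spider|slurp|curl|wget|python-requests/i;

function isLikelyBot(userAgent: string): boolean {
  return userAgent === "-" || BOT_PATTERNS.test(userAgent);
}

function splitTraffic(entries: LogEntry[]): { human: LogEntry[]; bots: LogEntry[] } {
  const human: LogEntry[] = [];
  const bots: LogEntry[] = [];
  for (const e of entries) (isLikelyBot(e.userAgent) ? bots : human).push(e);
  return { human, bots };
}
```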
JavaScript tracking is not as accurate as we would like it to be because of:
- Visitors who use browsers with JavaScript disabled.
- Visitors who block or delete cookies.
- Cookies that time out.
- Impatient visitors who click away before the tagged page loads completely.
- The bots mentioned above, most of which do not execute JavaScript.
An important reason for using web server logs could be site errors. Nothing is more frustrating than discovering that visitors are lost because of broken links, missing pages, or other errors. Although it is possible to track customized 404 pages, JavaScript-based tracking is not really suited for error reporting. Log analyzers track every HTTP status the server returns (e.g. 404 missing pages, 500 server errors, and 301/302 redirects).
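Again building on the parsing sketch above, an error report from the raw log can be as simple as tallying status codes and listing the paths that most often return 404; the functions below are illustrative only.

```typescript
// Count HTTP status codes from parsed log entries to surface missing pages,
// server errors and redirects.
function statusReport(entries: LogEntry[]): Map<number, number> {
  const counts = new Map<number, number>();
  for (const e of entries) {
    counts.set(e.status, (counts.get(e.status) ?? 0) + 1);
  }
  return counts;
}

// List the paths that most often return 404, so broken links can be fixed.
function topMissingPages(entries: LogEntry[], limit = 10): [string, number][] {
  const counts = new Map<string, number>();
  for (const e of entries.filter((e) => e.status === 404)) {
    counts.set(e.path, (counts.get(e.path) ?? 0) + 1);
  }
  return [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, limit);
}
```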
On the other hand, there is also data that can’t be recorded in the log file, such as your visitor’s screen resolution, page load time, and scroll depth. The JavaScript tracker has the advantage here.
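These metrics exist only in the browser, which is why a tag is needed to collect them. A rough sketch of what such a tag could read is shown below; the property names and the commented endpoint are hypothetical.

```typescript
// Browser-only metrics that never appear in a web server log.
function collectClientMetrics() {
  const nav = performance.getEntriesByType("navigation")[0] as
    | PerformanceNavigationTiming
    | undefined;
  return {
    screenWidth: window.screen.width,
    screenHeight: window.screen.height,
    // Total page load time in milliseconds, if the Navigation Timing API is available.
    loadTimeMs: nav ? Math.round(nav.loadEventEnd - nav.startTime) : null,
    // How far down the page the visitor has scrolled at this moment, as a percentage.
    scrollDepthPct: Math.round(
      ((window.scrollY + window.innerHeight) / document.documentElement.scrollHeight) * 100
    ),
  };
}

// e.g. send alongside the page view (hypothetical endpoint):
// navigator.sendBeacon("https://collect.example.com/metrics", JSON.stringify(collectClientMetrics()));
```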
Logaholic supports both the JavaScript and the web server log method. By simply inserting a special tag in a web page, the already detailed log file information is enriched with data that can only be obtained with JavaScript. We have named this capability “Hybrid mode”.
Logaholic supplies an uncompromising view, unhindered by the limitations of an exclusively SaaS solution or by ad-serving interests that might favor the less accurate method. We simply want you to use the best tool for your specific situation; we don’t need to sell you anything else.
Using Google Analytics to track your ads? Read this!
As highlighted above, the two mainstream data collection methods will report significantly different website traffic numbers. Obviously, advertising conversion rates will also differ. Combining the results of both data collection methods leads to more accurate insights, which is crucial for staying in control of performance-based ad-buying costs. It is not a comfortable feeling to give your ad provider control over your credit card while knowing you are not able to verify the ad charges.
Web server logs contain ALL website traffic. JavaScript trackers don’t, and may miss a lot of (paid) clicks. It is a known fact that a percentage of ad clicks are ‘false’ or violate the terms of your paid traffic provider and should not be charged; such false clicks amount to ‘click fraud’.
Experts estimate that 15-20% of all paid (PPC) traffic is in a grey area. Not being able to conduct proper forensics with a reliable tool is costing you, or is going to cost you. Advertising charges are impossible to verify with a JavaScript tracker alone; you need at least a tool capable of analyzing web server logs. Logaholic supports both methods.
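As a rough illustration of that kind of forensics: paid clicks usually arrive with a tracking parameter in the landing URL (for example Google Ads’ gclid), so you can count them in the raw log and compare the total against the number of clicks you were billed for. The parameter name and the comparison below are illustrative assumptions, building on the LogEntry type from the parsing sketch above.

```typescript
// Count paid clicks found in the raw server log (clicks tagged with a
// tracking parameter such as Google Ads' "gclid") and compare the total
// with the number of clicks the ad provider billed for.
function countPaidClicks(entries: LogEntry[], param = "gclid"): number {
  return entries.filter((e) =>
    new URLSearchParams(e.path.split("?")[1] ?? "").has(param)
  ).length;
}

function compareWithInvoice(loggedClicks: number, billedClicks: number): string {
  const diff = billedClicks - loggedClicks;
  return diff > 0
    ? `${diff} billed clicks never reached the server log: worth investigating`
    : "billed clicks are consistent with the server log";
}
```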