When I go to a web site I expect to get cookies from that web site to enable me to interact, especially a web site like the Linux Format Forums. However, I kept noticing these very odd cookies turning up in my cookie jar, so this time when I started Firefox from scratch, I checked the jar every time I opened a site. I tried each of my regular sites a few times and found that whenever I accessed the Linux Format forums, I got a cookie from them and some from two other domains:

hitbox.com
ehg-futurepub.hitbox.com

as well as
linuxformat.co.uk

So, I put the first two on my block list, and can see exactly ZERO difference between getting hit by them and not.

So, can I please ask: just what are these extra cookies there for, and what are they doing? And why couldn't they do that with the cookies already in the linuxformat cookie jar?

Deadly Ernest
(all typos are the fault of the
server gremlins, not the writer)

hitbox.com is part of the Omniture web analytics service. The ehg-futurepub.hitbox.com cookie will be the one that does the reporting back to provide Future Publishing with information on the number of visitors, visitor frequency, visit duration and visitor locality. There may be other information extracted from the browsers used during the visit.

Future Publishing could try collecting, collating and analysing similar data but it is probably cheaper to have a specialist data collecting and analysing company processing the data from all of their websites than employing programmers and analysts themselves.
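As a rough sketch of the mechanism being described (the hostnames are the real ones from this thread, but the path and parameter names below are made up for illustration): the forum page embeds a small resource served from the analytics host, and the browser's fetch of that resource is both what reports the visit and what sets the hitbox.com cookies.

```python
from urllib.parse import urlencode

# The forum page embeds a tiny image or script served from the analytics
# host. When the browser fetches it, the request's query string tells the
# analytics service which page was viewed, and the response carries the
# hitbox.com Set-Cookie headers. Parameter names here are hypothetical.
params = {
    "account": "futurepub",                # hypothetical customer id
    "page": "linuxformat-forum-index",     # page being reported
    "referrer": "http://www.linuxformat.co.uk/forums/",
}
beacon_url = "http://ehg-futurepub.hitbox.com/beacon?" + urlencode(params)
print(beacon_url)
```

The key point is that the request goes to a hitbox.com host, which is why the cookies it sets belong to hitbox.com rather than to linuxformat.co.uk.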

ollie wrote:hitbox.com is part of the Omniture web analytics service. The ehg-futurepub.hitbox.com cookie will be the one that does the reporting back to provide Future Publishing with information on the number of visitors, visitor frequency, visit duration and visitor locality. There may be other information extracted from the browsers used during the visit.

Future Publishing could try collecting, collating and analysing similar data but it is probably cheaper to have a specialist data collecting and analysing company processing the data from all of their websites than employing programmers and analysts themselves.

Well, it looks like they'll have problems with me, as I've now blocked those two sites and they're getting squat. From a blank browser, I hit the forum and get four sets of cookies: one from Linux Format, two from ehg and one from hitbox. Hell, what do they need so many for? And if it's important, why not just include them with the linuxformat ones? That one has nine cookies in it anyway (what the heck they need so many for is beyond me too).

Deadly Ernest
(all typos are the fault of the
server gremlins, not the writer)

Technically, for Omniture (hitbox.com) to access the cookies they must be requested by a server from the same domain, so Omniture can't access any cookies set by the linuxformat.co.uk server. For more information on how cookies work read The Unofficial Cookie FAQ. This FAQ also addresses the privacy concerns of receiving cookies.
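A toy sketch of that same-domain rule (a simplification of the real cookie domain-matching algorithm, not the full specification):

```python
def domain_match(cookie_domain: str, request_host: str) -> bool:
    """Simplified cookie domain matching: a cookie set for
    '.hitbox.com' is sent back to hitbox.com and its subdomains,
    and to no other host."""
    cookie_domain = cookie_domain.lstrip(".")
    return (request_host == cookie_domain
            or request_host.endswith("." + cookie_domain))

# The hitbox cookies travel between hitbox hosts only:
print(domain_match(".hitbox.com", "ehg-futurepub.hitbox.com"))  # True
# ...and are never sent to the Linux Format server:
print(domain_match(".hitbox.com", "www.linuxformat.co.uk"))     # False
```

The browser applies this check on every request, so neither site can read the other's cookies; each server only ever sees cookies scoped to its own domain.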

I seem to remember something, many years ago, about setting cookies up so site A sends them out and site B gets the information. I wasn't all that interested, but I remember some of the discussion at the time: the data collection takes place at site B, but the trigger to send the data only comes from site A. So A asks for the data, my system sends it, and it goes to B.

Failing that, anyone wanting to collect information off my system damn well better be doing it from within their own server setup, because every time I find a cookie from a source I don't know, I block that source. If it's not important enough for them to do it at their own site, it's not important enough for me to provide it.

Doubleclick hasn't gotten anything from me in years.

I did have one organisation where I couldn't access their web site, so I complained. They did tests and informed me that their web site wasn't allowing access because I had cookies turned off. After much investigation, it turned out their system wasn't allowing access because I was blocking their ad-sense cookies. I then complained about that, and the web master said stiff; he was on contract, and he lost the contract after I complained to the web site owner. The Departmental Secretary was NOT impressed to be asked by the Ombudsman to explain why people couldn't access a public government department's web site unless they gave permission to gather data for a private company. Investigation showed no such authority was given by the department; the contractor was doing it for his own reasons, something about extra payment from someone. That was enough to terminate the contract for cause, and to change the web site.

Deadly Ernest
(all typos are the fault of the
server gremlins, not the writer)

Deadly_Ernest wrote:Failing that, anyone wanting to collect information off my system damn well better be doing it from within their own server setup, because every time I find a cookie from a source I don't know, I block that source. If it's not important enough for them to do it at their own site, it's not important enough for me to provide it.

The information is being collected for Future Publishing, so even if they did do it in-house, it would be a central setup from a futurenet.co.uk domain, not a separate one for each magazine's site. But why reinvent the wheel? Where's the sense in employing programmers to duplicate a system that is already available?

"Insanity: doing the same thing over and over again and expecting different results." (Albert Einstein)

Deadly_Ernest wrote:Failing that, anyone wanting to collect information off my system damn well better be doing it from within their own server setup, because every time I find a cookie from a source I don't know, I block that source. If it's not important enough for them to do it at their own site, it's not important enough for me to provide it.

nelz wrote:The information is being collected for Future Publishing, so even if they did do it in-house, it would be a central setup from a futurenet.co.uk domain, not a separate one for each magazine's site. But why reinvent the wheel? Where's the sense in employing programmers to duplicate a system that is already available?

Because when I see a cookie from an organisation I know I visit, I know it's legit; when I see a strange one and don't know where it came from, I wonder about it, so I take no chances and no prisoners. The cookie from their web site comes from them, so if they want extra info, then they put in the extra work; if they get paid for that info or for throwing that ad at me, then they work for that money.

There are many accounting services firms out there, yet some companies still do their accounting in house, and the same goes for IT services. Why? Because they want control of it. When you go outside, you lose some control; in this case, they went outside and they lose the info collected from me.

I may give you approval to collect data, but with anything collected by an outside source there's no solid guarantee that they won't use it for someone or something else.

Deadly Ernest
(all typos are the fault of the
server gremlins, not the writer)

ollie wrote:Future Publishing could try collecting, collating and analysing similar data but it is probably cheaper to have a specialist data collecting and analysing company processing the data from all of their websites than employing programmers and analysts themselves.

nelz wrote:The information is being collected for Future Publishing, so even if they did do it in-house, it would be a central setup from a futurenet.co.uk domain, not a separate one for each magazine's site. But why reinvent the wheels? Where's the sense in employing programmers to duplicate a system that is already available?

I think it did make sense the first time - in fact, both times. It's just that I disagree with the approach, as there's no way for me to validate whether the external source is legitimate or not. So my answer is: only those I know are collecting info on me for their own purposes are allowed to have it. A lot of the external organisations like Doubleclick take the information collected and use it for their own purposes or sell it to others, so I don't let them have any info from me. If they want my info, they can pay me for it.

Deadly Ernest
(all typos are the fault of the
server gremlins, not the writer)

Firefox used to have the option to reject cookies from third parties but it disappeared with Firefox 2 and is not available in Firefox 3 either. It was a great option that significantly reduced the number of cookies on my systems. I just haven't tried the various extensions like CookieSafe that offer advanced cookie management.


I feel like seriously blushing now. I knew I couldn't find that setting in FF2, so I didn't look for it in FF3, but after reading your post above, I went looking again. I use Kubuntu Linux with FF3: go 'Edit' - 'Preferences' - 'Privacy' tab, and in the middle it says Cookies. The second check box reads 'Allow third party cookies'; it's checked by default, and I just unchecked it.

I feel like I just had a 'well duh' experience.

Deadly Ernest
(all typos are the fault of the
server gremlins, not the writer)
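For anyone who would rather not dig through the Preferences dialog, the same switch can be set from a user.js file in the Firefox profile directory. The preference name is the real one behind that checkbox; the profile path varies per install, so the path below is a placeholder.

```javascript
// In ~/.mozilla/firefox/<profile>/user.js -- read at every startup.
// 0 = accept all cookies
// 1 = accept cookies only from the originating site
//     (i.e. block third-party cookies such as the hitbox.com ones)
// 2 = reject all cookies
user_pref("network.cookie.cookieBehavior", 1);
```

Setting it in user.js also survives profile upgrades, since the file is reapplied over prefs.js each time Firefox starts.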

The point I was making, which you didn't cover, was that even if Future did it in-house, the cookie still would not come from linuxformat.co.uk, because they would want to collect information across all their publications.

"Insanity: doing the same thing over and over again and expecting different results." (Albert Einstein)

Oh dear. And all the time I thought cookies just made my computing time easier. I am not entirely convinced by the financial argument made here, though. After all, we kind of expect all this information on the web to be free, but the truth is, it has to be paid for somehow. Seems like it's time for LF to do an article (another?) on spam filters.