Chris,
Even in the unlikely event of the W3C taking on this role (of UA vetting
registrar), it would not solve the verifiable DNT problem. DNT could still
be set by routers, drivers, proxies, browser extensions etc., and the
third-party advertiser would not detect that.
If the problem is making sure that the user has been properly
informed/canvassed, the solution may lie with the UGE mechanism. Publishers
need advertising revenue, so they will want script to call the UGE API. If
that finds the API is not supported, or that it reports a different value for
DNT from the header signal, we could give them a way to report that so their
third-parties can take appropriate action.
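To make that mismatch test concrete, here is a rough sketch of the decision the publisher's script would make (the function name and inputs are mine for illustration only; the exception API itself is still being specified, so treat its shape as an assumption):

```javascript
// Decide whether the page script should report an illicit DNT signal
// to its third-parties. Inputs are what the publisher's script can see:
//   apiSupported - whether the UA exposes the exception/UGE API at all
//   apiValue     - the DNT value the API reports ("0", "1", or null)
//   headerValue  - the DNT value the server saw and echoed into the page
function shouldReportIllicitDNT(apiSupported, apiValue, headerValue) {
  if (headerValue === null) return false; // no DNT header, nothing to verify
  if (!apiSupported) return true;  // header present but no API: suspect injection
  return apiValue !== headerValue; // API and header disagree: suspect injection
}
```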
The reporting could be via cross-domain messaging, but that would need JS to
receive it and would be difficult to verify. Maybe this is yet another
suitable case for a transparent DNT override using a cookie with a well-known
name (and site-specific cloning of cookies to third-parties).
To recap, what I suggested was a specially named cookie, e.g. W3CTP=X,
which would always override the value of DNT. If it is placed (in Set-Cookie
or document.cookie) it is cloned to site-specific third-parties, qualified
by the domain cookie attribute.
It would signal consent (W3CTP=C=1) or its absence (W3CTP=C=0). If it was
not present, DNT would rule. If DNT was not present, local law would prevail.
The UA could use it to revoke OOBC in the same UI as DNT UGE.
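As a sketch, that precedence would resolve like this (the cookie name and the C=0/1 syntax are from my proposal above, not from any spec):

```javascript
// Consent resolution order proposed above:
//   1. a W3CTP cookie, if present, always overrides DNT
//   2. otherwise the DNT header rules
//   3. otherwise local law prevails
function resolveConsent(cookieHeader, dntHeader) {
  const m = /(?:^|;\s*)W3CTP=C=([01])/.exec(cookieHeader || "");
  if (m) return m[1] === "1" ? "consent" : "no-consent";
  if (dntHeader === "1") return "no-consent";
  if (dntHeader === "0") return "consent";
  return "local-law"; // neither signal present
}
```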
This gives a solution for EU ePrivacy, COPPA signalling, transparent OOBC
and illicit DNT detection (W3CTP=DNT=I). It has built-in support for sunset
revocation using the expires attribute, and also allows third-parties to
transparently signal OOB consent without JS or needing to rely on the
first-party (this last may be contentious).
Mike
-----Original Message-----
From: Chris Mejia [mailto:chris.mejia@iab.net]
Sent: 26 July 2013 22:41
To: Rigo Wenning; Shane Wiley
Cc: public-tracking@w3.org
Subject: Re: ISSUE-151 Re: Change proposal: new general principle for
permitted uses
Rigo, you stated: "If W3C would stop having a process and discussions about
a process and either throw out the industry, the consumer or the privacy
experts, respectively, we could advance within weeks."
I hope you are not suggesting that the way to reach consensus is to simply
kick out your paying members and invited experts, then do the work on your
own? That doesn't sound right to me... Working group members, in both
camps, have brought valid concerns around process and are seeking clarity
and accountability from the co-chairs and staff-- I don't think it's
constructive to effectively respond with "put up or shut up" (I'm
paraphrasing, of course, but that's what I took from your reply to Shane).
Shane wrote: "DNT can be set easily by any technology with access to the
page request header outside of user control" and you responded "...your
assertion is just wrong."
Shane is actually right: the DNT header CAN be easily set by any tech with
access to the page request header, outside of user control (e.g. private or
corporate routers can do this) -- it IS a valid technical concern that we
currently have no way to validate how DNT was set-- whether it was an
informed user choice or not. Check it out with any tech expert; Shane is
right. Until this is solved, it's virtually impossible to distinguish true
signals through the noise of bad signals, and that's a problem for DNT.
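To illustrate the concern: any intermediary that can rewrite request headers can add DNT before the request reaches the server, and the server sees nothing that distinguishes it from a user-set header. A toy illustration (real injection happens in the raw HTTP stream at the router or proxy, not in application code):

```javascript
// An intermediary with access to the request headers can stamp DNT: 1
// onto every request it forwards. The origin server receives a header
// object indistinguishable from one produced by an informed user choice.
function injectDNT(requestHeaders) {
  return { ...requestHeaders, DNT: "1" };
}
```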
Shane wrote: "we'll likely have a high percentage of DNT=1 traffic on the
internet" and you responded "Does that mean you fear that the opt-out system
could actually work?"
Please define "could actually work". If you mean high DNT rates = works,
then your prejudice is clear. In this case, I guess you'd argue that low
DNT rates = broken. What if only individual human users could enable DNT
based on sound education regarding its enablement, and they decided not to?
Would that define a broken state/mechanism to you, simply because people
chose not to send DNT? Or would you say those are broken users? I for one
advocate for USER EDUCATION and INDIVIDUAL USER CHOICE-- don't you? Btw,
per the rest of your argument, there is absolutely nothing today stopping
German publishers from "opting-back-in" users who employ ad blockers;
likewise, there is absolutely nothing preventing the same publishers from
only serving their content to those users who do not use ad blockers. DNT
doesn't solve this problem, so let's not conflate issues.
You wrote "the issue is the unrest in the marketplace."
I don't see any evidence of widespread "unrest" in the marketplace; quite
the contrary, as evidenced by growing web statistics. Take online
purchasing as an indicator of market health; the year over year growth of
online purchasing is staggering-- I don't believe anyone will argue
otherwise. So, if there were so much "unrest" in the online marketplace as
you propose, would you expect that consumers would still choose to make
their purchases more and more online? I wouldn't-- it's not logical. Our
industry has invested heavily in brokering trust with our users and this is
clearly evidenced in the numbers-- we don't need DNT to "fix"
anything-- broadly speaking, user trust already exists despite your best
efforts to convince the marketplace otherwise. Now of course there are some
individuals (a relatively small number, comparatively speaking) that don't
trust. Our industry, and browsers alike, have gladly provided those
INDIVIDUAL USERS the mechanism to opt out-- no problem, we respect an
INDIVIDUAL's right to CHOOSE.
Shane wrote "This means sites will need to ask users if they set the DNT
signal and/or ask for a UGE for a large majority of visitors" and you
responded "You don't. You just test the user agent... And you need a lawyer
to tell you what to do? Come on!"
You may be on to something here, Rigo. If the W3C TPWG cannot come up with
a real technical solution to this problem (something that works in
real-time, on 100% of server calls), I propose that the W3C take on the
infrastructure and costs associated with providing a "DNT user agent vetting
registry service". The TPWG can set requirements for user agents, then YOU
(W3C) test the user agents, posting the results to a globally accessible
registry. Companies can then poll this registry (daily) for updates, and
will only honor DNT when it's been determined that a user agent has met the
required criteria for setting DNT: an informed user choice. User agents
that want to send DNT should apply for certification from the W3C, and if
they meet the requirements, be added to the registry.
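To sketch how companies might consume such a registry (entirely hypothetical -- no such W3C service exists, and the lookup key and data shape here are my own assumptions):

```javascript
// Honor DNT only when the signal is present AND the sending user agent
// appears in the (hypothetical) daily-polled W3C vetting registry.
function honorDNT(userAgent, dntHeader, vettedAgents) {
  if (dntHeader !== "1") return false; // no signal to honor
  return vettedAgents.has(userAgent);  // honor only certified UAs
}
```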
In providing this service, you should agree to an industry & consumer
advocate oversight committee to monitor your work, as well as regular
independent 3rd party audit/accreditation of your service (may I suggest
MRC-- they are good at this). Easy, right? And you need a technologist to
tell you what to do? Come on :)
Shane wrote "This is an "opt-in" paradigm - which we agreed in the beginning
was inappropriate (DNT=<null>, user makes an explicit choice)"
and you responded "Who is responsible for DNT:1 spitting routers? W3C?"
Yes, W3C is responsible; it's your spec. See "DNT user agent vetting
registry service" (above) for next steps on cleaning up the marketplace mess
that's been created.
You wrote "If you can't distinguish between a browser and a router, I wonder
about the quality of all that tracking anyway."
Rigo, this is why you are a lawyer, and not a technologist. Technically
speaking, we are not talking about distinguishing between browsers and
routers; we are talking about distinguishing between validly set DNT
signals and ones that aren't. You'd need to understand how HTTP header
injection works to fully appreciate the technical problem. The best
technologists on both sides of this debate have not been able to reconcile
this issue. Neither have the lawyers.
You wrote "I do not believe, given the dynamics of the Web and the Internet,
that we can predict the percentage of DNT headers for the next 3 years; let
alone the percentage of valid DNT headers."
True, no one has working crystal ball technology that I'm aware of, but we
do know that despite there being no agreed upon specification in the
marketplace, user agents are sending DNT header signals today. No matter
how many signals are sent, if you want DNT signals to be meaningful to
users, industry adoption is key. Please stop asserting that our technical
and business concerns are trivial or ill-informed-- they are not. Most of
your replies below are not helping us get closer to a workable DNT
solution-- you are only further exacerbating our concerns.
Chris
On 7/25/13 12:40 AM, "Rigo Wenning" <rigo@w3.org> wrote:
>On Thursday 25 July 2013 04:39:35 Shane Wiley wrote:
>> Rigo,
>>
>> I feel like we're talking past one another.
>
>We are not. The DAA tells the world that "the World Wide Consortium
>sputters and spits trying to negotiate a Do Not Track standard to
>protect consumer privacy online, the digital advertising business is
>forging ahead with expanding its self-regulation program to mobile
>devices."
>http://www.adweek.com/news/technology/ad-industry-expands-privacy-self-regulation-mobile-151386
>
>This is unfair. If W3C would stop having a process and discussions
>about a process and either throw out the industry, the consumer or the
>privacy experts, respectively, we could advance within weeks. No more
>sputters and spits.
>
>>
>> 1. DNT can be set easily by any technology with access to the page
>> request header outside of user control
>
>The French call that "dialogue de sourds", the dialog of the deaf. If
>you can test the presence of a UGE mechanism, your assertion is just
>wrong. Repeating it doesn't make it become true.
>
>> 2. This means we'll likely
>> have a high percentage of DNT=1 traffic on the internet (some say as
>> high as 80%)
>
>Does that mean you fear that the opt-out system could actually work?
>And that you are deeply concerned that users could opt-back in? If we
>stall, you can time-travel into the next 5 years and talk to the people
>from German IT-publisher Heise: They lost large parts of their revenue
>due to blocking tools. It will be 80% of blocking tools instead of
>DNT-Headers.
>They would LOVE to have a way to opt their audience back in. IMHO, if
>the industry ignores the golden bridge of DNT, they will have to cross
>the rocky valley a few years later. As I said, the issue is the unrest
>in the marketplace, that people will buy whatever promises them more
>privacy, even a DNT-spitting router. To your point: you may see 80% of
>DNT:1 headers, but how many of them will be valid according to the W3C
>Specifications?
>
>> 3. This means sites will need to ask users if they set the DNT
>> signal and/or ask for a UGE for a large majority of visitors
>
>As I explained: You don't. You just test the user agent. We both know
>that DNT has two technological enemies: 1/ Cookies + implied consent
>and 2/ DNT:1 spitting routers and dumb extensions. Now the united
>internet expertise in this group can't distinguish between those and a
>valid browser? And you need a lawyer to tell you what to do? Come on!
>
>> 4. This is an "opt-in" paradigm - which we agreed in the beginning
>> was inappropriate (DNT=<null>, user makes an explicit choice)
>
>Who is responsible for DNT:1 spitting routers? W3C? Is this conformant
>to the current state of our specifications? Nobody in this group wants
>DNT:1 spitting routers. That's why we have ISSUE-151.
>>
>> To adopt DNT under the Swire/W3C Staff Proposal (aka June Draft),
>> industry would be agreeing to shift to an opt-in model vs. agreeing
>> to support a more hardened opt-out choice for users that is stored in
>> the web browser safely away from cookie clearing activities (which
>> remove opt-out cookies today unless the user has installed an opt-out
>> preservation tool). This is a significant shift and will not likely
>> be supported by industry. Hence the reason we're pushing back so
>> hard on the current situation.
>
>Your assertion of an opt-in model is a myth and a perceived danger, not
>a real shift in the Specification. The routers are shifting, not the
>Specification. This is just the first sign of market unrest. If you
>can't distinguish between a browser and a router, I wonder about the
>quality of all that tracking anyway. Are we discussing giant dumps of
>rubbish quality data? If so, consumers and privacy experts may relax a
>bit. For the moment, they assume that you can do profiles and things
>and distinguish between users and their devices etc.
>>
>> I believe I'm being as fair, open, and honest about the core issue.
>
>And I do not question that. We even agree that there is an issue. And
>we have a number for that issue. I tell you that your conclusions and
>suggestions will lead to a totally nullified DNT, not worth our time.
>And I encourage you to consider a reasonable solution to the problem,
>not a short-circuiting of the system with an industry-opt-out behind.
>
>> Hopefully we can work together to look for solutions to this
>> unfortunate outcome (unfortunate for industry as I can imagine some
>> on the advocate side would be very happy with an opt-in world).
>
>Again, opt-in/out is a myth. DNT installs a control, a switch. This is
>much more than opt-in/out. BTW, I do not believe, given the dynamics of
>the Web and the Internet, that we can predict the percentage of DNT
>headers for the next 3 years; let alone the percentage of valid DNT
>headers.
>
>Finally, the only ways a company can be forced to honor a DNT:1 header
>is:
>1/ By our feedback making a promise it does
>2/ By a self-regulation like DAA or Truste or Europrise
>3/ By law
>
>I would be totally surprised by a law that would force you to accept
>"any" DNT:1 header.
>
>So let's work on distinguishing the good from the bad headers. We had
>very good discussions in Sunnyvale with the browser makers. They are
>also interested in a solution. There must be a way.
>
> --Rigo
>
>