I'm not going to restrict to the OpenGL ES minimums. Rather, I'm going to look for a compromise. There are two different approaches I'm considering:
* try to leak mostly only a one-dimensional parameter. I need to look at which of those MAX_... parameters can be grouped together into a single one-dimensional parameter; I understand that not all of them can be.
* try to leak mostly information that is strongly correlated with information that can be obtained anyway.
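The first approach could be sketched roughly as follows. This is only an illustration of the idea, not any real implementation: the tier table, the parameter names, and the `reportedLimits` helper are all hypothetical, and it assumes the hardware meets at least the lowest tier (e.g. the OpenGL ES minimums).

```javascript
// Hypothetical sketch: collapse several correlated MAX_* limits into one
// coarse "capability tier", so only ~log2(number of tiers) bits leak
// instead of each raw driver value. Tier values here are made up.
const TIERS = [
  { maxTextureSize: 2048, maxViewportDim: 2048, maxVertexUniforms: 128 },
  { maxTextureSize: 4096, maxViewportDim: 4096, maxVertexUniforms: 256 },
  { maxTextureSize: 8192, maxViewportDim: 8192, maxVertexUniforms: 1024 },
];

// Pick the highest tier the real hardware fully supports, and report the
// tier's values to content instead of the raw driver values. Assumes the
// hardware satisfies at least TIERS[0] (the spec minimums).
function reportedLimits(hw) {
  let chosen = TIERS[0];
  for (const tier of TIERS) {
    if (hw.maxTextureSize >= tier.maxTextureSize &&
        hw.maxViewportDim >= tier.maxViewportDim &&
        hw.maxVertexUniforms >= tier.maxVertexUniforms) {
      chosen = tier;
    }
  }
  return chosen;
}
```

With three tiers, all the grouped limits together leak about log2(3) ≈ 1.6 bits, at the cost of advertising less capability than some hardware actually has.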

In the Chromium WebGL implementation I am going to
continue to push for the maximum amount of functionality, whether it
be the number of available uniforms and varyings, or the available
WebGL extensions.

Fine, but this will leak roughly 10 bits of identification information, for the most part not correlated with already-leaked information, so it adds almost fully to the number of bits already leaked. That can easily make the difference that allows implementing server-side evercookies.
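To make the "adding up" concrete: uncorrelated bits multiply. A toy calculation, with a hypothetical `anonymitySet` helper and made-up population numbers:

```javascript
// If a population of N users is already partitioned by k fingerprint
// bits, the expected anonymity set is N / 2**k; adding b uncorrelated
// bits shrinks it further to N / 2**(k + b).
function anonymitySet(population, bits) {
  return population / 2 ** bits;
}

// 10 extra uncorrelated bits cut every anonymity set by a factor of
// 2**10 = 1024, e.g. from ~1M users sharing a fingerprint to ~1K.
```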

What will leak roughly 10 bits of information? How do you come up
with this value?

I suspect that the range of discrete values returned by the MAX
queries is actually quite small across implementations. For example,
I expect MAX_VIEWPORT_DIMS will be one of 1024, 2048 or 4096
across the many, many different models of graphics cards. Therefore
the amount of information leaked will not be so great.
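The two estimates can be reconciled with a back-of-the-envelope entropy calculation. This assumes, hypothetically, that each MAX_* query takes one of a small set of discrete values and that the queries are independent, which is the worst case for privacy; real parameters are correlated (the same GPU family tends to have the same limits), so the true leak would be smaller.

```javascript
// Worst-case bits leaked by independent queries, where each entry in
// distinctValueCounts is how many discrete values that query can return.
function bitsLeaked(distinctValueCounts) {
  return distinctValueCounts
    .map(n => Math.log2(n))
    .reduce((a, b) => a + b, 0);
}

// Seven independent parameters with 3 plausible values each comes to
// 7 * log2(3), about 11.1 bits: the same order of magnitude as the
// "roughly 10 bits" figure above.
```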