We Won't Participate as Judges in Wine Competitions: Here's Why

We did not arrive at this position without much thought and
discussion. Ultimately, we believe that transparency and clarity are
core values that should permeate the wine world -- from the creation of
wine, to the marketing of wine, to the writing about wine.

Everything
that happens in those areas should relate in some way to answering this
question: Is this providing more transparency and clarity to the
consumer, or less?

We have decided that medal-focused
competitions provide less clarity and transparency to the wine consumer.
We feel that medals only confuse consumers instead of educating them, and that they provide little real value.

Our
position going forward will be simple: The editors and writers at the
New York Cork Report will not accept invitations to judge wines at
large-scale, blind-tasting events whose goal is to hand out "medals" to
"winning" wineries.

We want to explain, and --
this is vitally important -- we mean no disrespect.

The vast, vast
majority of competition creators, organizers and judges perform their
roles with the best of intentions. Often, we find that the wines we
think are best are the ones that win top honors (Anthony Road
Wine Company's 2008 Semi-Dry Riesling winning the Governor's Cup is one
example).

But that cannot and does not change the reality: There are so many medal-awarding competitions
that the events have lost any sense of meaning to the average consumer,
and even wine-loving consumers can't possibly know the significance of a
single bronze or silver or gold medal awarded at the many, many events.
Furthermore, the very act of blind judging a wide range of wines should
be viewed as a parlor game and not some
official declaration of merit.

Good
intentions give way to nebulous marketing

We
can't stress this enough: The organizers of wine
competitions are people who constantly impress us with their enthusiasm
and event planning. Collectively, we have judged at many events and have been invited to judge at many more. We admire the goal of
wading through oceans of wine to sort out the very best for consumers.

The
problems with judging will be addressed below.

Even
the medal winners can't explain much about the meaning of such an
award. Evan recently stopped by a Finger Lakes tasting room that was
drowning in medals. He was told, "Our 2006 Merlot won Silver at the So-and-So Wine
Competition!" He asked the staff to explain what that meant. "Well, it
probably means that the judges liked our wine very much!" they replied. He asked who the judges were. They didn't know. He asked how many wines, by
percentage, got at
least a silver medal. "Oh, I don't think it's very many," came one
reply.

Sadly, that's wrong, by almost any measure.

On Long Island, Lenn has been similarly regaled by tasting room staffers with stories of medals awarded -- often incorrectly. He follows these things closely, so he often knows that staffers are wrong when they tell him their riesling won gold when it actually won bronze. The average person off the street can't possibly know; there are too many medals from too many competitions. Ultimately, these medals and the discussions around them have become nothing more than white noise, like static on your television.

Medals have almost no
defined meaning that the wineries themselves can even explain, let alone
their consumers. Ask a consumer what a medal means -- really, grab a
customer in a tasting room -- and there's almost no chance they'll be
able to offer anything close to an answer describing where it comes from
and why the judges awarded it.

It seems that
wineries simply hope the use of medals will make their bottles more
attractive. We understand the impulse. The business of wine is a competitive one, and
discretionary dollars are being held tightly. But ultimately a state
that is attempting to attain world-class status does itself a disservice
with an over-reliance on meaningless handouts.

We
can promise that almost every
tasting room customer would be shocked to find out that often the
standard for getting at least a bronze medal is simply to create a wine
that is not mortally flawed. That's it. That's the baseline.

The
first problem with judging: Subjectivity

At
Evan's first wine judging competition, a huge annual event that we won't
name, he remembers a debate over a flight of pinot noir. One judge refused
to award a particular wine a gold medal because, in his words, "There is
plenty of fruit but not nearly enough supporting oak." Evan, understandably, was stunned. A
judge demanding more oak? What next?

He didn't have to wait long to find out. During the next flight, a judge detected a whiff of Brettanomyces in
one of the wines. She decided it was a nice addition to the wine,
adding character. The judge to Evan's right was offended to the point of
near-insanity. "Brett is a FLAW," the judge declared. "And a flawed wine
wins no medal." The other
judge persisted, arguing that it should be a gold medal wine. Evan thought he was about to witness a fistfight.

How can
you or anyone else tell a judge how to evaluate wines? The beauty of wine is that we
have the opportunity to decide for ourselves what makes a wine special.
Now, that's not to say that there aren't clearly discernible qualities
and flaws. But if I love oak and over-extraction in, say, cabernet sauvignon, and you appreciate a more restrained approach, which one of
us is right? If I think the best wines are indicative of where they were
made, whereas you believe the best wines are hedonistic missiles,
place-be-damned, who's correct?

If you sit in
on a judge's panel at just about any wine competition, get ready to hear
the same conversations. And then ask yourself how anyone can possibly
hand out medals when it's over, as if one wine correctly identified that
7x4=28.

The second problem with judging:
Blind tasting

Everyone on the NYCR team has come to love blind
tasting. It is great fun. It is also a bit like a sporting event or game, not befitting the
anointing of medals that ostensibly carry serious value.

There
is perhaps no wine more fitting to explain this problem than Finger
Lakes riesling. The best winemakers in the Finger Lakes often remind
their customers that riesling is a "food wine." It certainly is. It is
versatile, ranging from dry to sweet, and pairs harmoniously with a
range of dishes. Winemakers have such things in mind when crafting their
products. But they are not producing rieslings designed to impress judges in sterile, blind-tasting settings.

Now try to imagine tasting dozens
and dozens of these wines with hardly a bite to eat. The acids are
ripping at your mouth, and in the sweeter flights the sugars seem like a
welcome respite. In the cabernet flight, there is no juicy steak to
accompany a rich wine, and the judges
are left to evaluate them without that pairing.

But
most importantly, blind judging robs the evaluators of the most
significant part of the wine -- its context.

Tell a judge he's drinking
cabernet, and he'll immediately try to lock in and ascertain the country
of origin, then the region and perhaps sub-appellation. But the mind is
a funny thing. Instead of simply enjoying (or not) the wine, and
thinking about it individually, the judge begins to add context where
there is none provided. How did the other wines in the flight taste in
comparison? What might that say about this wine? When was the last time I
tasted a wine like this one? Where was it from? Should I allow myself
to believe this is Bordeaux, when I'll feel awfully silly when I'm told
it's from somewhere else?

Delving deeper, we
find that judging a wine that is simply known as cabernet sauvignon is
extremely constricting. We don't want
a Napa cab to taste like a Bordeaux. We expect Chile to turn out
something else entirely. If we're tasting a Bordeaux cabernet that tastes like
Napa, we're bound to be disappointed. But tasting blind, we might convince
ourselves it's from somewhere else, mistaking place and winemaker
intention. Whoops.

We've had judges tell us that we should forget about figuring out where a wine is from and simply
taste it to see if we like it. Fair enough. But in that one statement, we
see exactly why wine has become so homogeneous, so dangerously banal.
Judges are not required to give a damn about a wine's sense of place.

We find it vital. With no standard, how can we expect judging to be
consistent?

Ah, but see: It's not consistent.
Not even a little.

There is ample evidence
that judging is like throwing darts

When
Robert T. Hodgson set out to research the reliability of
judging, many of us suspected he would find that judging is
inconsistent. Instead, he found that medals are awarded in a fashion
that almost appears to be random. Hodgson wrote, "It is reasonable to
predict that any wine earning any medal could in another
competition earn any other medal, or none at all." Indeed, he found
hundreds of examples of wines that earned gold medals in one competition
and no medal at all in another.

Put another
way: If you make a competent wine, you can enter enough competitions and
that wine will almost certainly win gold eventually.

No
study is perfect, but we suspected that after this study was released,
drastic changes would hit the wine judging circuit. We have yet to see
any. Hodgson stated that his goal was to provide some measure of judging
reliability to help these competitions improve. We see the result being
supportive of the idea that these competitions
ought not exist at all. After all, judging in mass competitions is
putting wine into just about the least suitable place for good
evaluation and enjoyment.

And for wineries
that might protest, it should be said that the little study that has
been done only indicates that tasting room customers really don't care
much about medals. Why should they? As we've already explained, they
don't know what the medals mean.

Clarity?
Consumers don't know which wineries entered a particular competition and
which didn't, they don't know the judges and what the judges are
looking for, they don't know how many medals were awarded, and they
don't know what a medal is supposed to signify.

That
should say everything.

Our decision, and
our call for others to join us

In the
future we will politely decline invitations to judge at
these events. That does not mean we won't participate in wine seminars,
conferences, etc. This is simply about mass judging. The
wine competition circuit has become quite an industry itself, but no one
has offered a good explanation of the purpose it serves.

We ask our
colleagues to do one of two things: Pledge to join us in this decision,
or provide a suitable answer for the problems we've outlined above.
We're more than willing to listen, and to change our minds if it can be
proven that these competitions help the consumer.

Comments


Good post, but there's a third way -- just judge at the decent competitions. Like wines, there are good and bad competitions. It's difficult to run a good competition, but some good ones exist. The primary problem is that good judges in these sorts of settings are rare. Very rare.

I should also mention that we will be modifying our own NYCR Wines of the Year competition format to align with this new, public stance.

Jamie: That's a fair point, but is there a list somewhere that will tell us which competitions to agree to and which not? I don't think so :) And even the good ones don't escape the challenge of what medals really mean to consumers.

I'd also like to thank Evan for pulling this post together and doing the heavy lifting for it. Capturing dozens of back-and-forth emails into a coherent post isn't easy. He did it with skill and aplomb.

Brilliant post, and one that I wish I had written. Your writing does much more for the industry than judging. I wonder, though, how much of what you say or suggest can also be applied to point scores.

Bravo.
I noticed a while ago that many of the Finger Lakes wineries I consider superior are the ones that spend less time talking about their medal count. In my experience, the best wineries focus on their product instead of awards. When I'm in a tasting room, I'll purchase a wine with interesting characteristics or a wine that will pair well with a food I like. Winery staff members who help me to understand and appreciate the wine will make a sale. A bottle displayed with bling will not.

As a wine judge, I must agree with some of your points. Fatigue is a serious problem (one that many have criticized Parker for after he mentioned multiple wine marathons).
I would like to mention that it is through these competitions that I was able to bring Red Newt Riesling and Ravines to my customers, I've used 2 X G.C. winner Chateau Lafayette Reneau as a springboard to other Seneca Lake wines and got to really come to know more than a few wine makers. I recognise these are unique circumstances that only pertain to the handful of retailers who qualify to be in these events.

Oops, hit post too soon -- but perhaps the NYCR writers would benefit from trying some wines blind, discovering a few that they felt were superior and doing further research and tasting on those standouts. The IEWC and G.C. are perfect venues for that.

While I completely understand your frustration with the many shortcomings of show judging as practiced here in the United States, I wonder if many of these same criticisms couldn't be leveled at the practice of rating wines in general. For example:

"...there are too many medals from too many competitions."
If this is really an issue, isn't "there are too many ratings from too many reviewers" even more true? Just look at all the magazines, newsletters, blogs, bulletin boards and CellarTracker users--surely those numbers far exceed the number of wine competitions.

"Medals have almost no defined meaning..."
Hmmm...I'm pretty sure there is (or should be) a set of criteria provided to the judges of any show and available to the public setting forth what each level of accolade means. Just as there is (or should be) for any rating system. Unless you'd like to suggest that the 100-point scoring system is so transparent that it requires no elucidation...

"blind judging robs the evaluators of the most significant parts of the wine--its context."
How important context is to wine judging depends on what the criteria for judging are, regardless of whether we're talking about medals or points. If the criteria include only sensory assessment of balance, length, intensity and complexity, the need for context is minimized. If the criteria include some notion of typicity (however defined), context is vitally important. And while context may be important, it is often more important to avoid any perception of impropriety or bias that can result when tasting open label. Some context may be gained while still tasting blind by appropriately narrowing the classes; e.g., "Aged New York Cabernets, four or more years post vintage." or "Young New World Rieslings, no more than two years post vintage."

"...judging is like throwing darts."
Ummm, yeah, but... A wine faring differently from one competition to another is not particularly different from one reviewer liking a wine and another not -- and I don't think anyone would find that particularly unusual. So because wines can earn different ratings from different reviewers we should stop rating wines? I believe the same study also found that certain judges were more consistent than others, which should provide some basis for improvement.

I've only judged at this sort of wine show once, a long time ago, and I understand it can be a frustrating experience. But I think probably the only way to make the system better is to participate and work from within to improve it. Removing yourselves from the system may make you feel better, but having less qualified judges taking those seats and awarding those medals won't make show medals more meaningful or less confusing for consumers.

From my perch, I think it's a good idea for untrained people to stay away from judging wine, but that would eliminate a zillion wine critics, too, and those useless points that critics issue.

Your blanket statement against blind wine evaluation is completely one-sided and from the standpoint of an untrained evaluator. If you want to understand the value of blind wine evaluation, which is for quality control reasons, you should attend the workshops that Cornell U operates for winemakers at the agricultural station in Geneva, NY.

But I agree that the blind tasting used for many competitions is flawed.

Jamie - I'm an enormous fan of your work, and it's great to get feedback from you. If your suggested third way could be easily achieved, it would be worth discussing. Ultimately I keep going back to Lenn's point in the comments, which is that I'm still not sure it clearly serves consumers. But can I ask: Can you add an example or two of the best competitions? Again, thanks for stopping in.

Joe (and Ryan) - Regarding wine ratings, Lenn and I like to rib each other because I don't favor point scores in any form. But I have much less issue with point scores than blind tastings. First of all, Lenn can taste one wine in a day for review, and he can taste 15, but it's not going to be several hundred. Palate fatigue is not a potential problem as it is in judging. Also, Lenn tastes non-blind and cares deeply for the background, the approach, and the story. Wine writing is subjective, and we don't apologize for that part of it. Readers can decide whether they like the approach or not, or even if they like the approach but don't like a reviewer's writing or evaluation priorities. But there is more context in a NYCR review. Finally, readers can decide if the reviewer is consistent. They don't have that opportunity with judging; they almost never know who the judges were, and even with that information, they might not have any idea what those judges typically favor.

Ed - Speaking of fatigue, great point! Not sure why we didn't stress it even more in the original post. This is a major issue. Now, you bring up another very important point in this debate: Often, judges will travel and have a chance to experience regions and wines they otherwise might not. For example, the Finger Lakes International Wine Competition, held in Rochester each year, offers judges a chance to travel to select Finger Lakes producers for tasting and dining the day after the competition. It is a wonderful idea, and very effective. I simply hope the industry would find a way to continue to bring in retailers, writers, and professionals in lieu of these judging events.

Ryan Love - Not a big surprise, is it, that the producers you consider best are exactly not the ones bowling you over with medal results. Great observation.

We knew this would generate great discussion. Glad to see it happening.

First, I think it's important -- or would at least be great -- if we could keep the discussion to the topic at hand. Adding scoring/rating/points into this discussion (especially in a NYCR context) gets away from the main argument.

Thomas: You said "completely one-sided and from the standpoint of an untrained evaluator" but didn't expand on it. Could you? A winemaker blind tasting his or her own wines is a bit different than judges at a competition doing it.

I think it's important to say that the NYCR crew blind tastes all the time. I actually taste blind regularly if I'm reviewing several of the same variety/blend -- at least on day one. Then I un-bag the wines and obviously know what they are on days two and three.

To me, one of the most important things -- and why I taste for 3 days with and without food -- is that NO ONE is drinking wine the way that judges or most critics are tasting them. I won't even score a wine based on an at-winery tasting because spending a minute or even five with a wine is an artificial construct.

We try to be very consumer- and reader-focused here on the NYCR. They are the only reason we use the 100-point scale -- they asked for it.

Who is asking for medals? Obviously wineries love to use them as marketing tools, but I think that's the easy way out and not all that effective. When EVERYONE has a gold medal, no one is differentiated.

Oh, and did I mention how much I have disliked the judging gigs I've done? It's exhausting, mouth-ravaging work and nowhere near as educational as sitting down with a wine (or even 15, to Evan's point) and spending several days with them. I've tasted wines on day 1 and thought "wow, that's not very good." But a day or two later the wine was transformed.

If I tasted that same wine at a competition, it wouldn't fare very well, right?

I guess in some ways I've brought scoring into the discussion after saying not to. Sorry about that. :)

See, the scoring discussion is good to have right here, right now. Because your method does indeed allow for some evolution, some "opening up", and the chance to see how the wine plays with and without food. Not everyone rates wine that way, and as much as you and I might disagree with scoring in general, I simply have to admire the way you do it. Your method is so different than wine judging, well, it's like comparing apples and anvils.

The oldest expression in the wine world is the one where "wine doesn't travel" and "it tastes better at the winery than ..."

CONTEXT is the key to an intellectual appreciation, and sometimes a deeper affection, and certainly a better understanding of a particular wine.

I'm not sure who has ever come away from a blind tasting with a deep appreciation of a wine - not in comparison to drinking a wine in proximity to the vineyard and/or with the winemaker present.

If that is LESS honest, then so be it. I'd much rather read that story -- and I'd rather appreciate that wine vicariously through the writers on NYCR -- than hear about a silver medal produced by a clinical dissection of a wine's flaws or lack thereof.

"Thomas: You said "completely one-sided and from the standpoint of an untrained evaluator" but didn't expand on it. Could you? A winemaker blind tasting his or her own wines is a bit different than judges at a competition doing it."

Lenn,

First, I said nothing about a winemaker tasting his or her wine blind. Where did you get that from? The training at Cornell is for those of us in the wine business to learn how to accurately identify the things we often run around telling other people we think we know about wine but often find that we know less than we think.

Evaluating wines for hedonistic purposes is never going to get anyone to a point of agreement, whether the evaluation is done blind or in context. Aesthetic evaluations are personal and subjective--period--they have little to do with the wine and more to do with the evaluators. I know that seems cynical, but no one has come up with any system of evaluation that has proved otherwise to me, except for those training programs designed to hone our sensory equipment.

To truly evaluate a product you must first agree to and establish standards (no standards exist in the critical wine evaluation world); then, evaluators must be trained to identify the things that have been set as standards and the things that fall outside those standards and are considered flaws; then, the evaluation process must include a test of the evaluators for their abilities and their accuracy (this is done by intentionally adding into an evaluation both flawed products and duplicates).

Brett is a perfect example of the fallacy behind hedonistic evaluation. Some people hate any noticeable Brett and some people accept a certain noticeable level. Plus, each person has a unique threshold for identifying Brett (and other things in wine).

More important, however, is the question of whether evaluators know the true problem connected with Brett; the one that has an effect on the course of a wine's life span in the bottle. Only trained people can even come close to talking about that issue.

Brett is only one intricate component in wine that many critics are woefully ill equipped to evaluate.

Unfortunately, far too much of the discussion about wine is not about the wine but about ourselves and our perceptions. For that, all you need is an opinion. This situation is what renders most wine competitions and most wine criticism no more useful to me than handicapping a horse race.

"For that, all you need is an opinion. This situation is what renders most wine competitions and most wine criticism no more useful to me than handicapping a horse race."

Thomas and Lenn seem to be saying the same thing from different angles. I will add to my comment insisting that Context is key and add Thomas' valid points about Opinions.

The opinion of a group of "untrained" sommeliers, "untrained" journalists and bloggers, and "untrained" wine buyers, when gathered up in all of their blindness and distilled down to one "winner" is indeed flawed, I think we all agree on that.

I think a good point to make is that the Opinion of the staff of the NYCR, when derived from a contextural point of view, and presented transparently, is far more valid and valuable than the mathematical rendering of the contest.

I use quotes to highlight the "untrained" not to mock Thomas' words at all, but to point out that the clinical evaluation of wine for quality control has no bearing on, nor relationship to, the processes that go toward wine appreciation -- just technical evaluations.

Cornell's program is no doubt rigorous, but it doesn't promote the warm and fuzzy feeling we're all searching for, does it?

I was intrigued by this interesting discussion today. After a lengthy discussion tonight on Twitter about the relevancy of wine competitions, I felt we needed to ask non-industry consumers "If a wine wins a medal at a wine competition, are you more likely to purchase it?" See the more than 60 responses posted at http://www.facebook.com/fingerlakeswinecountry.

"...you should attend the workshops that Cornell U operates for winemakers"

Geez, Lenn. Nowhere does that quote say that the winemakers are tasting their own wines. The Cornell programs are geared toward winemakers to hone their skills, but anyone in the business is welcome to attend, and I recommend the programs to all who want to learn more than they think they already know.

Jim,

If wine evaluation is about warm and fuzzy, then why all the fuss about how it is done? Is it that those who evaluate would have consumers believe that they come at it with a level of knowledge or an insight that others do not possess?

I'm not so sure that disagreeing with someone's opinion makes that person's opinion any less valid, and I'm not so sure that agreeing with it makes it any more valuable. But an opinion that is based on knowledge and training, and can be demonstrably measured, well, now we have something on which to pin our hopes.

Why is it that every time someone calls for trained wine evaluators, someone else brings up the canard about "making it too technical?" When you ask a professional in any field for an opinion, wouldn't you prefer that the professional know the subject in depth?

The reason this conversation concerning competitions and wine evaluations is an evergreen is precisely because the activity rests on sand.

If you don't like the way wine competitions work, you can work from within to try to make them better or you can pick up your ball and go home (like you've just announced).

The game will still go on, you just won't be a part of it. And the game won't be better for your absence, or less confusing for the customer.

And once you're out of the game, complaining about the rules just comes across as griping about a competitor. You rate wines--and what else is a wine competition but another way of rating wines?

Yes, wine competitions have their drawbacks. But so do individual reviewers and the various methodologies they employ. It seems a bit disingenuous to say on the one hand that the democratization of wine criticism in the form of CellarTracker reviews, bulletin boards, blogs is a good thing--that everyone's impressions are equally valid--and yet slam competitions, because many of the same criticisms apply.

The dynamics between large wine competitions, wine critics ratings and consumer-driven reviews are so completely different that they defy comparison.

At the same time, if a wine can manage to rise above competition in all three of these arenas, then it is pretty certain to be not just good, but *really* good. Very few wines can achieve this, which puts us all back at square one, which is that people are potentially faced with SOOOOOO many wines and are still looking for guidance. (Personally, I think they are better off getting that guidance at the point of sale, but that's another issue....)

My suggestion to you - Lenn, Evan and gang - is to continue to do what you do, but develop a system that is unique to you... NOT points and NOT medals. (The drawbacks in both of those systems are growing more glaring with each passing month in this Internet-fueled age.) Keep that up and ultimately people will simply know that the first resource for advice in NY wine will be what NYCR thinks.

This isn't about "taking our ball and going home." We've obviously given this a lot of thought. In our view, the best way to make a change in wine judging is not to do it at all, because there seems to be no good way to do it in mass formats that would offer any real, discernible benefit to consumers. Staying involved in some small way would only uphold the notion that these competitions serve a public purpose. Unfortunately, the evidence suggests that they do not.

Again, look at the research that shows that just about any halfway-competent wine will win gold if entered enough times. How can we effectively work to change a format that leads to such results? We're not blaming organizers or judges. They work hard and they're talented folks.

Wine is simply one of the most complex things in the world when it comes to evaluation. Reducing it to sterile, palate-busting mass tastings is one of the least effective ways to find out if it's any good.

We've already addressed the scoring issue, but you haven't responded to our points. Take Lenn's very important point that he tastes wine over multiple hours and even days. We all know that some wines are tight on opening and show better with more time or alongside a meal. This will never be possible in such settings. We also know that some wines greet you with a blast of something unpleasant upon opening, and that unpleasantness can blow off. Should those wines be penalized because there's not enough time for them to show their full profile?

I recently saw a Finger Lakes winery boasting of a gold medal for a 2006 Merlot. I've had the wine. We all know that 2006 was a largely disastrous harvest for red varieties in the Finger Lakes, and the results show it. That same Merlot was entered in numerous other competitions and did not fare very well. But to consumers, they're only told it's a "gold medal wine." They're not told, "This wine won one gold medal, but struck out in a bunch of other competitions."

It's like a baseball player who strikes out in his first 20 major league at bats, then hits a home run. Then he strikes out 10 more times. Any team would send that player back to the minor leagues. But by wine competition standards, the fans would not even know about the strikeouts! They would only know he's a home run hitter, based on one unlikely thunderbolt.

There's just not a good, simple solution that we can see. Our hope is that if others make the same pledge, competitions will either have to change completely or go away. In the interests of consumers, that's the best result we can foresee.

Morgen, Tish - Thanks for the comments and links. Tish, very interesting work on your medal story this month. Can you post a link here so others can get to it?

Thomas - It's not an old canard to target the technical aspect of this endeavor. It's a valid question. Yes, wine can be evaluated on technical merits, but these competitions create so many problems in truly finding the best wines that they simply become "everyone-gets-a-medal" events. It's worth discussing whether they can arrive at any results at all that are truly beneficial to consumers.

Two people whom I respect deeply, and some of the best judges I've spent time with, are Chris O'Gorman (from Merryvale Vineyards in California) and Lorraine Hems (a wine educator at RIT and NYWCC). They plan to stop by and comment, and I look forward to that. They are both smarter than me - by far! - and I think they'll add to the discussion.

We appreciate the depth of thought going on here. I think this is productive, even if some of us will have to agree to disagree. Cheers.

Five days ago, before this article was posted, I made a comment in the discussion about the Governor's Cup that is worth reproducing here:

In Europe people do not fuss so much about wine; they grow up in a culture where wine is served at the table, and they likely live near a grape-growing region. Wine is seen as a beverage to enhance the meal.

Here we have a somewhat different attitude, reflecting our never-ending quest for (unattainable) perfection. And when it comes to wine, it is a relatively new phenomenon. Therefore we have developed a whole bunch of guidelines to help the uninitiated. We try to make a scientific endeavor out of wine tasting, developing rating systems that are supposed to provide a numerical (as if there were such precision) evaluation of a wine, with all sorts of scales, the most favored being the 100-point scale, which starts at 75!

When I have wine at dinner I do not use a rating system or a scale. I tend to reflect on how well the wine goes with the food. Is it enhancing the food, or is it taking away from it? And if the wine is extraordinary, how does it differ from my expectations?

The point of all this is that sometimes it is useful to pull back and reflect on our discourse. There is no such thing as the best of anything. There is no best car, or best tomato, or best carrot, or best computer without some context by which to make that evaluation. Similarly, there are today an increasing number of very good wines, and it is the consumer's good fortune that there is so much to choose from. If I were to drink 10 different wines a day for the rest of my life, I will have tasted but a fraction of all available wines.

The same is true for self-proclaimed or otherwise anointed wine judges. They could not possibly have tasted more than a fraction of the wines of the world. When they judge a wine, we have to ponder: are they telling us about the wine, or about their own taste for wine? Those of us who have the self-confidence to enjoy our various wines are not necessarily impressed by wine judges or competitions. Those of us who are curious will attempt to taste more wines we do not know rather than keep going back to the same wine. And those who are more conservative will keep buying the same wine over and over again, because they have found what satisfies them and do not need to experiment further.

Dan Kleck, a winemaker from the early North Fork generation, once told me that if you want to build a wine cellar on the cheap, start a wine competition. You get two or three bottles of each wine, taste one or two, save the third, and get paid for the whole exercise. A bit cynical perhaps, but packed with wisdom.

As a political reporter in my day job, I often hear politicians bemoaning the federal earmarking process. But then I ask if they will stop bringing pork back to their home districts. They all say they will not. They say, "The process is terrible, and it must be changed. But I'll work within the system to make that change."

How is a wine that's entered into multiple competitions until it wins something any different from one that is sent to multiple reviewers? In either event, the winery will promote the top result, and consumers will be largely in the dark about other, less flattering, results.

That is about marketing and the use of ratings of any sort to promote wines--not limited to wine show medals.

Your political analogy made me chuckle, but folks leaving Congress have yet to make the system better, just as voter apathy concentrates power in the hands of certain--generally more extreme--factions.

One of the obvious reasons many wineries enter wine competitions, especially small wineries, is that they cannot get reviews of their wines to begin with, so the competitions are the only place they can get what amounts to a review. Small wineries make big deals out of their medals because they can't get a review from their local paper, let alone the NY Times, the WSJ, Wine Spectator, The Wine Enthusiast, Robert Parker, or Steve Tanzer, or even a wine blog as prestigious as NYCR.

They have nowhere else to turn. The competitions rate their wines - Gold, Silver, Bronze. And that's something they can show their customers as a rating from the outside world about the quality of their wine.

However imperfect, the competitions have been performing this service for many years.

As for their quality, it varies, just like animal judgings. What one judge puts up for Best in Show, another wouldn't recommend for Best in Breed, no matter whether it's the AKC or 4-H. People generally have good intentions. It's subjective and antique. So is wine reviewing. But it's the best we can do.

Just to be clear before I depart the discussion and await Lorraine Hems, whom I've known for a long time and who, I'm sure, will contribute something smart to the conversation:

I'm not saying that wines should be evaluated only for the technicals; I'm saying that those who evaluate wine ought to be formally trained to do so, and that a set of standards by which to gauge both the evaluation and the evaluator ought to be established.

The reason this issue keeps popping up (and you guys seem to be saying it, too) is that there are no consistent standards either from competition to competition or from reviewer to reviewer.

Maybe I'm a true cynic, but I don't believe that people are magically endowed with the skills to evaluate wine just because they drink the stuff.

Carlo - You bring up a very important point, and you state it well. It's too bad that smaller wineries can run into trouble getting reviewed. Obviously the NYCR seeks to put up no barriers to coverage, but for major publications, it is a challenge, certainly. Thanks for reminding us about this.

One theme that I'm noticing in the comments here is how useful the medals are/can be for the wineries.

That's not really our point here though -- we think they do consumers a disservice.

The other theme -- that rating wines is as problematic as judging them -- is iffy (at best) in my mind. As Evan said already, if people come to this site and see a score, they know that I gave it. They can very easily click through the site and the reviews posted here, likely find a NY wine that they are familiar with, and learn over time whether our palates are similar. I've done the same thing over the years with WA and WS and know which critics my palate is more similar to than not.

Even if someone who sees the medals awarded at a big competition knows who awarded them (rare), they likely won't know anything about those judges. And because many judges are not writers, the consumer (even if he/she tried) would have a hard time gaining an understanding of the judges' preferences and track record.

This isn't about discounting the abilities of the judges themselves. I've worked with outstanding palates and minds at competitions. It's the system that is at fault.

I totally agree with Jamie on this. Rather than write off the competitions, a better play might be to agree only to those competitions that have merit.

You will never be able to control the deceptive use of the results, but that doesn't make the process / competition itself bad.

I myself am a big critic of wine medals because I think they offer little consumer value -- but we can only change that by making the awards more legit and giving consumers a better frame of reference (i.e., helping to create the Super Bowl of wine competitions).

Hold on a second, Thomas...do I need a course from Cornell to evaluate and/or judge wine? Because I can't afford Cornell University. But I do have 22 years of professional wine experience. And I judged the recent Florida International Wine Competition.

Am I magically endowed - or just you?

Are we evaluating the hedonistic/intellectual aspects of these wines or are we like diamond-cutters looking for flaws?

Dude - At the very least, in your scenario, we'd have to do two things:

1) Ascertain what makes for a legit or good competition. What sets them apart? I'd love to hear examples. And

2) Start educating customers aggressively about medals, and push wineries to have more info handy. For example, when I see some BS gold medal for a 2006 Finger Lakes Merlot, I'll want to ask: How many competitions did you enter this wine in, and what were the other results? If they can't answer, I won't have any interest in the medal they're plugging.

Never said anyone NEEDS a course from Cornell--just suggested that sitting in on the sensory courses offered to winemakers at the AGRICULTURAL STATION (not at the university) might help develop skills for those who would be wine evaluators. I am sorry if a little learning is scary, but, to me, it is necessary.

"Are we evaluating the hedonistic/intellectual aspects of these wines or are we like diamond-cutters looking for flaws?"

And here you go again with the canard.

No point in me responding again to what I am forced to view as willful misunderstanding of my point.

"Am I magically endowed - or just you?"

Based on that snide response, you took the comment personally. I can't vouch for your insecurities, but I know that I am not magically endowed, and never claimed to be. I know perfectly well the purpose of the personal pronoun and didn't use one.

Argue your points, tell me why you disagree with my position, but don't become abrasive in the process just because you meet with someone who expressed direct opinions; it's unbecoming of a 22-year professional.

Oh, and I top you in the business by four years--I have covered as many areas of the industry as your resume claims that you have, but with the addition of having been a grape grower and winemaker. You see, I can use the personal pronoun when it's needed--I can also do investigative journalism so as to know with whom I am talking.

One point that no one has seemed to mention is that wine competitions are expensive! Paying 50-100 dollars PER WINE to enter one of these things isn't cheap when you start entering half a dozen or a dozen wines in a half a dozen of them. Why do WS, WE and W&S see so many more wines than even the largest competitions? Partly because they're nationally relevant (way more people read the ratings section of those magazines than the results of the Indy Wine Competition) and partly because they're FREE to enter.

I would love to hear opinions on the economics of paying for these medals and how much more wine they actually sell. The Facebook comments Morgen posted about don't scream "increased sales."

Brad - Just did a midday count on Morgen's snap poll. 40 people said the medals would not lead them to buy. 26 said that, while the medals wouldn't lead them to buy, they might be more willing to taste a medal-winning wine while in a tasting room. 10 said the medals would lead them to buy. Pretty strong indictment, if you want to put value in something like that.

There are indeed barriers to entry, or at least barriers to entering a large number of competitions. And I think consumers truly don't understand that these competitions are not heavily representative of regions, and certainly not representative of the small, artisan producers. If the Minnesota Twins win the AL Central, is that as impressive as winning the World Series? Certainly not, but by wine competition standards, the Twins would just call themselves "champions". Who knows what the competition was?

Blake - Obviously we are not taking a stand simply to prevent wineries from selling a few more bottles in their tasting rooms. And really, if medals and wine competitions were to go away completely, do you think winery staffs would have no other opportunity to increase sales? Your premise seems to be that without these competitions, sales would go down. I'd like to think that without medals, wineries would have to be ever more creative. How is that a negative? There are so many ways for them to improve what they do to reach customers. I don't think a little kick could hurt.

For us, it's about doing the right thing to make sure our readers and wine consumers have the best information possible. If the end result is that wineries lose medals as a marketing tool, that would likely be a net positive, as it would force them to engage in new ways to make up for any lost sales (which probably don't amount to much, anyway).

Wine competitions are just one of a few valuable tools for consumers to use in making their purchasing decisions in a marketplace that is increasingly filled with a sea of wine.

When folks encounter an aisle or 10 aisles full of wine from around the world, the choices can be overwhelming. A piece of smart POS with a gold medal or a score, or even a hand sell from knowledgeable (we hope) staff can prevent this from being a needle-in-a-haystack proposition.

We all agree that in the realm of the subjective, evaluating wine is an imperfect endeavor and anecdotal mistakes are very easy to find, whether it’s gold for a mediocre wine or Parker whiffing on a bretty or “hot” Super Tuscan.

I fail to see how the reviewers of the NY Cork Report are any different from a panel of judges at a wine competition or even someone who reviews opera for the NY Times. Anyone can have an opinion on wine, and if a consumer takes a recommendation from a particular reviewer or wine competition or blogger and they are disappointed, then they will choose to look elsewhere for recommendations.

I would like to give 95 points to Joe and Tish for their recommendation that the NY Cork Report become re-engaged in wine competitions to improve the process, or… start your own wine competition.

Instead, we hope that taking this stance brings some attention to this issue -- which it is doing on some level -- and eventually helps people understand the reality here as we see it.

Let's face it, while competitions can become expensive, they are easy, mindless marketing. It doesn't take any sort of creativity or marketing savvy to slap a "GOLD MEDAL" on a tasting room sheet or a website. Enter enough competitions with even a decent wine and you'll get your gold. But remember that just about every other winery is doing the same thing.

Christopher: You said "valuable tools for consumers to use in making their purchasing decisions"

Part of our point is that they are not valuable -- not in any real way.

Sure, if confronted with two similarly priced wines, a consumer might buy the one with the gold medal sticker -- but are they getting the better of the two wines? I say it's 50-50. If they are getting the better of the two wines, great. But if they end up taking the lesser of the two wines home, then the medal has just done that consumer a disservice.

The same can be said of scores, of course, but even in lazy stores a score is often at least accompanied by a tasting note. That's not true of a gold medal sticker slapped on a bottle. That's why people who score wine without any sort of tasting note tied to it are just as guilty.

But I keep coming back to these questions:

Which competitions are run well vs. not?
Who gave this wine this medal?
What are their credentials and track record?
How many wines did they taste during a session and where did this wine fall in the lineup?

With most scoring outlets, it's easy to answer at least most of these questions and then I can decide whether or not to put much value in the ratings.

For instance, I know that my palate is very different from Robert Parker's, so some WA reviews are meaningless to me really. On the other hand, David Schildknecht and I have agreed on many wines, so I follow him with more interest. The same is true on the WS side -- James Molesworth and I have far more similar palates than Thomas Matthews and I have.

You simply cannot gain that level of understanding with these black box competitions where wine goes in and medals come out.

An interesting post right after having judged a local wine tasting by amateur winemakers myself. I can understand your point. I like to select and drink wines for the education, taste and enjoyment - not their medals. It's a nice pat on the back to a winery to get medals in competitions against their peers, but does it say "We're the best"? After a dozen or so wines as a judge your tastebuds will be pretty close to toast. Sip, slosh, spit, water and bread or crackers can help you last longer and somewhat keep the palate clean, but if you are tasting one big oak monster, high alc cab after another your tongue will lose! I understand the reference to having some residual sugar as a relief! As I said, I drink wine not medals. Shelf talkers and neck sleeves can give medal counts and points ratings, but when I ask folks looking for wines in the store what they are looking for they often don't know. "Something I will like" is a common answer. Will a medal help with that?

Tom - Thanks for your honest assessment, particularly in light of the fact that you just finished judging a competition.

Let me ask you: There is a lot of talk about separating the best competitions from the lesser lights; what makes for a "better" competition? How can they improve?

I'm hoping Joe Roberts, Jamie Goode et al. will also tackle that. We want to be persuaded on this count, but so far I don't know what would set one competition apart to a degree that would alleviate the concerns we've outlined.

I'm glad Christopher O'Gorman stopped in. He's one of my favorite people in the wine industry; he has a great palate, he's an outstanding judge (in my opinion!), he's a Californian with an international understanding of wine, and he's a hell of a lot of fun. I was fortunate to meet Chris at a judging event - one of the benefits of these competitions, to be sure.

Chris, you raise an important point here: Medals represent opinions on wine, just as scores do, or recommendations from friends. I simply worry that consumers don't necessarily see it on those terms of parity or near-parity. The medals have been marketed as something they're not, and marketed without context or much explanation.

That said, I always look forward to sharing a tasting experience with you, whether in a competition or otherwise!

I especially appreciate the comments by Joe Czerwinski, W. Blake Gray, and Tish.

There are three unique wine competitions with which I am involved:
Sommelier Challenge International Wine Competition
www.SommelierChallenge.com
Critics Challenge International Wine Competition
www.CriticsChallenge.com
Winemaker Challenge International Wine Competition
www.WinemakerChallenge.com

The judges for each are, as the names suggest, sommeliers, critics and winemakers. The names of the judges are listed on each website. These competitions are run by Robert Whitley, nationally syndicated wine writer and publisher of www.WineReviewOnline.com. He is also director of the San Diego International Wine Competition, which has been around for 27 years. Robert has been tweeting about this post at @WineGuru on Twitter, which is how I learned about it. This comment is my own.

Simply put, a competition medal is a recommendation, just as a high score or positive review is. It can help give the wine consumer some confidence in his purchase of an unfamiliar wine, although obviously it is not a guarantee that he will like it, since taste is subjective. Similarly, it can help the winery in marketing the wine, though untrained tasting room staff or salespeople may not make the best use of it.

I love wine, and feel that anything that helps bring people and wine together is a good thing, especially in these difficult times when the wine industry needs all the help it can get. Forgive me, but I don't see how the current trend of bashing competitions, or the 100 Point Scale, or any particular publication/reviewer does the wine industry or wine lovers any good.

I'm wondering if the title of the original post has caused some confusion, as it is somewhat at odds with the rest of the article.
Title: "We Won't Participate as Judges in Wine Competitions"
Opening line: "It's official. We are done judging big, blind, medal-focused wine competitions."
That's an important qualifier, and it's the opening line that guided my interpretation of the article. My understanding is that NYCR is not "taking their ball and going home," forsaking all competitions. My understanding is that NYCR is saying, "There has to be a better game." The second comment explains that the NYCR's own competition will be revamped to align with the views expressed in the article. The article, in contrast to the title, does not decry all competition; instead, it's asking for a new model. If it is indeed NYCR's intention to design a new and improved model for wine events, then I applaud the effort. I would much rather have a staff of innovators at work than a few more judges in the big, blind, medal-focused competition pool.

Wow, this has become some thread. It's a tribute to the depth/breadth of the original post, as well as to the fact that judgings remain a mystery to most people...even Grape Nuts. I think picking the "best" competitions depends in large part on one's opinions of varied methodologies and the type/acumen of judges. And the key word here is indeed VARIED. They are all over the map. (Personally, I would put more stock in judgings where panels actually have to agree verbally on winners.)

I find it interesting that one of the competitions not even mentioned here so far (unless I missed it) is the so-called "Ultimate" ones. Organized by Paul Pacult and staged for the first time in 2010, they boasted of peerless judges, superior methodology, multiple rounds and even a way of converting results into the 100-point scale. So far as I can tell, the results have made nary a ripple in the Wine Conversation. This perhaps shows that people really don't care about what goes on AT the judgings, but rather only what comes out of them, and that the attention given to medals is more a matter of winners' PR than the selection of cream-of-the-crop wines. Paul got a bit hissy when I suggested that, as sound as his motives were, the execution was about ten years too late. Outside geeky circles (like peeps reading this thread), most people just don't care about the machinations behind the medal-making.

As suggested by Evan, here is a link to the article I wrote on medals for the trade publication Beverage Media: http://ow.ly/2uEMt In particular, the article talks about how the usage of medals has evolved, how diverse the competitions are, and my picks of a few comps to watch because they stand out as distinct (if not "better").

Felicia, I have a lot of respect for the judgings Robert Whitley organizes, but I disagree heartily with your criticism of the current trend of "bashing competitions or the 100 Point Scale." The wine scene is more vibrant than ever, and yet there is hardly any controversy/debate at all about wine in mainstream media -- which makes them appear more inauthentic with each passing month. It is happening online -- here and on Twitter and beyond -- precisely because there are many opinions on the topic. And I see no reason to accept the status quo in terms of how wine is judged, discussed and conceptually shared. Keep in mind that intelligent debates like this DO bring wine lovers together.

I'll start off by saying I urge you to take this with a grain of salt. I have neither the experience of the NYCR staff nor the education (yet) of some of those discussing here. But, hell, I have to put in my two cents.

ANY beverage reviewing, to me, is completely subjective. Even though Thomas certainly wishes for a clean-slate, methodical, clinical approach to judging wine...it just can't be done. It can't be done for the same reason that I can't trade my heart for your heart, or my kidney for your kidney. We're all different people. A lot of people say, "Well, we have different palates." While true, what makes up a person's palate is their life experiences. Each of us has a different life, a different history and past. We like different foods, different wines, different music. It's what makes us unique. These experiences related to taste help to form our palates. There have been some beverages I've reviewed that I just don't like because the flavor reminds me of a medicine I used to take as a child. But to someone else, it's a flavor they've never encountered, and they find it endearing. There can be no "uniform" method of review and critique. We all have our unique palates, and the key to finding someone you trust for wine advice is simply matching as many similar likes and dislikes between palates as you can.

This is why I don't like huge tastings that take a culmination of many, many people's opinions and condense it down into one score or award. You are trying to condense so many individual preferences and tastes into just one unit. It's like trying to combine the physical and mental aspects of an entire stadium of people into one person who represents everyone. It just can't work. I applaud magazines/blogs/etc. that have SEVERAL reviewers who willingly voice their opinions for all to see. Condensing reviewers' thoughts into one final verdict destroys the point of gathering those people to review in the first place. It should be more like the House of Representatives, not the Senate.

As for the aspect of blind tasting itself, I don't like it for personal reasons. I really, really enjoy learning about the wine in the bottle and the story around it. Blind tasting also leaves quite a bit out of the picture. If I taste two wines/spirits/what have you blind with no prior knowledge...it's not fair in my eyes. What if one bottle is $20 and the other is $200? Sure, maybe I'd be inclined to like the $200 wine more. Perhaps. But what if the $20 drank just as well as the $200? In blind tasting, the value of a wine is often ignored just to go straight for the taste. If a wine tastes like a $60 wine and it's only $15, that should be mentioned somewhere. But more often than not, it isn't (or at least not specifically). This ties into the medals aspect. If I were a wine producer, I'd prefer a statement saying "drank like it was a $60 wine" over a gold/silver/bronze medal. From what I've seen, the blind tastings don't release full lists of what they reviewed on a particular day. If a $200 bottle wins a gold medal but its competitors are all $70 or $80, is that particularly fair? The reverse is true as well.

Tish - Thanks for the follow-up, and now you've got me googling Paul and his competition.

Ryan - You're right, to a point. Our own NYCR Wines of the Year is much less like a wine competition. We select a very small number of wines from just a handful of categories. We'll work blind tasting in as one useful component, but not as the only measure. Our format continues to evolve, and we promise to be open with readers about how we do it. You are absolutely correct to note that we'd love to be in the game, so to speak, if something more effective can be constructed. I told Lenn in an email, "I want to be proven wrong about this." We don't intend to condescend, and many smarter people than us are responding. That's what we hoped would happen.

We have doubts, though, about any large competition. That's because palate fatigue is so difficult to prevent, and most wines don't get the benefit of much time to open up or re-taste. But this discussion is ongoing, and we think everyone should be looking for something better. We'll sit out until we can see it constructed, because we feel our stance is more effective in the overall conversation than by staying involved.

Scott - We know your concerns. I would simply say that fixating on price is not a great idea - not that you do, but bottles are priced differently for myriad reasons. Ultimately the wines have to be evaluated for quality, and then consumers can decide if the dollars are worth spending. We all construct our own QPR, right?

Blind tasting and context are important issues - but the context of vintage, type, region etc are all (wisely) embraced by some wine competitions, allowing wines of same style, origin and age (and in some instances price) to be judged side by side. The results tend to be much more reliable and consistent. Looking at the label ain't the kind of context that historically produces the best results for consumers.

And competitions should be treated like critics - you're mad if you don't sift the good ones from the bad before taking advice and parting with your money.

Evan - It's true, bottles are priced differently for a myriad of reasons. However, the bottom line is, a bottle is priced what it's priced. Whether the grower decides to minimize profit and release a wine that will bring acclaim for its price point, or decides to reach for the upper echelon of super-premium wine categories, depends entirely on the maker. But what they charge is what they charge, regardless of the reasons behind it. I'm not saying fixate on price, as in make it the entire focus of the competition, but rather introduce a "best buy" category. In the spirits industry (where I focus mainly) the same problem with medals is quite readily apparent. I have tracked down the submissions from many competitions and tried several of them myself to compare to the results. There have been many times where, while the bottle didn't win a gold, silver, or even bronze...it was still a stellar spirit for its price point.

I also don't mean to introduce the price points in EVERY competition either but, rather, the larger ones. It is a cold hard fact that there are those out there searching for a budget wine (myself included!). More often than not in these competitions I just can't afford the gold or silver wines. A "best buy" list would be beneficial for those of us who financially just can't drop $50 or $60 (hell, even $30 or $40) on a bottle of wine. Many magazines do it, why can't competitions? If we're condensing everyone's opinion into one judicial decision, why not condense everyone's QPR too? If you can trust the weight behind the medal, surely you can trust the weight behind economical sense?

"Even though Thomas certainly wishes for a clean slate, methodical, clinical approach to judging wine...it just can't be done."

Scott,

I posted this plainly and clearly to Evan: "I'm not saying that wines should be evaluated only for the technicals; I'm saying that those who evaluate wine ought to be formally trained to do so, and that a set of standards by which to gauge both the evaluation and the evaluator ought to be established."

"...those who evaluate wine ought to be formally trained to do so, and that a set of standards by which to gauge both the evaluation and the evaluator ought to be established."

That's...uh...exactly what I said. You want a list of rules and regulations in order to judge both "the evaluation" and "the evaluator". By whose definition should these benchmarks be made?

Let us assume that it's the person running the competition. What, really, makes them the most qualified to run the show? Do they have extensive experience in the field? Do they have an in-depth knowledge of everything wine? Well...says who? Shouldn't there be a standard for event officials? It would keep just anyone from running a competition and scoring free wine (as Charles Massoud said). So who will govern the event officials? Perhaps some sort of internationally recognized tasting-event judiciary committee? This is a classic case of "Who watches the watchmen?" By whose set of standards will we be gauging all this? I highly doubt (if these comments are any representation) that we'd ever unanimously agree on a definition.

Personally, I feel that "formal training" is something you do just to look good on paper. As a Chemical Engineer, I can guarantee you that everything you learned as an undergraduate could be more effectively learned on the job. Yes, it's true that I'm going for an MSc in Brewing and Distilling Science, but the core of the program I'm attending is hands-on practice. I've visited your website and checked out your resume. What is your definition of "formal training"? If it's attending accredited courses at a verified college, then I'm sorry, you wouldn't qualify under your own definition. I don't mean this as a slight, but let's face it: from what you've told us, you've garnered your expertise in the field. Even though you said:

"Maybe I'm a true cynic, but I don't believe that people are magically endowed with the skills to evaluate wine just because they drink the stuff."

Isn't that almost exactly what you've done? By requiring "formal training," you would exclude so many passionate and vivacious bloggers who may not have the time or money to qualify, yet who would be a valuable resource to a tasting panel regardless.

"I highly doubt (if these comments are any representation) that we'd ever unanimously agree on a definition."

Likely, but what the hell; I'm a reasonable guy, as long as it's my ideas that are attacked and not me...

In the context of this issue, my definition of "formal training" is training to develop the ability to perform to a standard that is set by the wine evaluation community, a standard that ought to include sensory proficiency on some identifiable level. What the standards are would be hammered out and agreed to by the community.

Do you seriously believe that we can will ourselves into the skills it takes to identify excessive v.a. or reduction? I've judged at competitions where each of those conditions (and others) were sometimes rewarded with a medal during the consensus stage of the judging, when judges talk about the divergence in scores.

As for my training, before I got into the wine business I took both sensory and winemaking courses at the original International Wine Center in NY. When I got started in the Finger Lakes wine business, I attended Cornell cooperative extension workshops to continue the learning. Now out of producing wine, I continue attending workshops when I need to know more regarding a subject I am commissioned to write about.

You certainly don't need a college degree to develop your sensory analysis skills, but I believe that you need more than passion.

As to this: "there are so many passionate and vivacious bloggers who may not have the time or money to qualify for "formal training..."

I'm not sure this is the subject we are talking about when we talk about standards for wine competitions. Please enlighten me. But anyway, wine judges generally aren't paid for their services, so it stands to reason that whoever wants them to judge could foot the bill for training them.

Having said all that, if wine competitions and reviews are nothing more than an extension of marketing then I agree, things should stay exactly as they are, because no one needs to hone sensory equipment to market wine.

I don't happen to think that wine reviews and evaluations should be an extension of marketing--I think they should serve the consumer instead of the producers.

Fair enough on the training. I'll give you that. The only thing that I'd be concerned about is that there are many out there whose only resource IS drinking a lot. You and I (being in NYS) are lucky to have NYC and its wealth of wine information, but there aren't wine programs like those in the city all around the US, or even around the world. I suppose that, yes, if you are lucky you can attend one, but I'm sure there are many potential judges out there who just don't have the resources to do so.

As for the willpower to learn skills...yeah, I believe you can do it. It's the same for many things, not just wine tasting. Think of it, as you say, like education. There are students to whom the lesson comes naturally. Then there are the students who, with work, will understand the material just as well as those who comprehend it naturally. While I may be overly optimistic, I feel that there are those who can and will learn to identify flaws given the exposure and/or tutelage. Granted, there will be those who never get it, but give them a few tries at judging and the wheat will separate from the chaff in due time.

As far as the blogger mention, I throw it in because of the major movement of businesses to embrace social media. Since we are viewing this from the marketing aspect, we can't neglect the blogger judge in this day and age. While not all competitions have fully delved into online media, it is only a matter of time. Even such prestigious beverage industries as scotch and cognac are branching into social media. It is inevitable that there will be a draw from these avenues for judging in the future, due to a variety of benefits that would take far too long to explain here. Suffice it to say, bloggers are becoming big due to their supposedly cheaper (cost-effective) word-of-mouth and viral advertising.

To your first paragraph: I understand that position, but aren't the centers of wine competitions either in major cities or at least within shooting distance of a local wine industry?

In any event, my feeling is that the organizers of competitions owe it to everyone to require trained judges, and since it would be to their standards, they should provide the training.

To your second paragraph: Surely, there are people in this world with innate talents and the ability to hone them almost on their own (provided they at least read so that they can place an identity on what it is they are honing). But it seems to me that something that claims to be providing a service or a recognition of achievement (a critical review or competition) should at the very least develop some set of standards to lend it credibility and to ensure that participants are working at the same approximate level of expertise.

I'll give you one outlandish for-instance: in one competition, a female judge walked into the tasting room smelling like Estee Lauder's laboratory, and not one of the officials explained to her what was wrong with that practice. They could avoid the thing altogether if they just told everyone in advance that they don't want cosmetic smells filling the room because, obviously a surprise to that untrained woman, it confuses the judges' sensory equipment.

Your third paragraph: Don't get me started on social media ;)

It seems, however, that you concede that wine evaluation is mainly a marketing tool. If that's to be the case, then I'll continue to find little value in it, mainly because I don't trust marketing to make an effort to put forward the absolute truth. I should also say that I am no fan of aesthetic criticism; I simply don't understand why what someone else prefers should be important to my existence, and I don't think what I prefer is important to anyone else's existence, except maybe my wife, who is forced to live with me.

My aim, when I judge wine, is to award achievement--the wine's not mine--which leads to the original point of Lenn and Evan's opening salvo.

A few months ago I made a decision to stop judging at competitions. I came to this decision mainly because I got tired of being prodded into awarding more Gold Medals--and I mean actively lobbied to do so by those running the show.

Therefore, I agree with the NYCR position, although we come at it from a slightly different direction.

It's been nice having a reasoned chat with you. Not something that happens online with frequency.

Great article. And wow, an amazing and very heated discussion, which is always fun. I'm actually still making my way through the last group of posts, so my point may have already been made.

I think the big thing here that is pointed out has nothing to do with who may or may not be qualified to judge and evaluate/score wines. The important thing, in my opinion, is the background behind those results. There is no possible way, imo, to come up with a standard set of criteria for judging, as has been suggested. It isn't possible to find criteria that suit all involved and the millions of different palates in the world. All can agree that wine is subjective. The only real standard I could think of is that the wine was made from grapes and is suitable for human consumption; is that grounds for a gold medal? Everything after will have some amount of subjectivity.

As a consumer, what I find useful about scores/reviews is that I know who is doing the evaluating. If I find that I tend to agree with Tanzer, Raynolds, Parker, GV, NYCR, Jancis, whomever it may be, then their opinion becomes useful to me. I know that I have agreed with their subjective viewpoint in the past and will likely agree in the future. It doesn't matter if they rate based on 100 points, 20 points, letters, badges, colors, or animal ratings. All the rating becomes is a reflection of their opinion, which I can choose to use in my decisions or not. This is simply not something that can usually be applied to a medal competition. A competition just cannot set the baseline consistently enough to generate a cohesive viewpoint on which to evaluate its judging. And that's my point: I can judge critics/bloggers etc. as to how they may or may not agree with my tastes. I cannot do that with a competition medal.

I have been following the posts, but admit it is hard to keep up with all of the comments/questions. A little background, I have been in the retail, wholesale and educational sides of the wine industry for about 30 years, so a few people have asked for my opinion on wine competitions. I hope I was asked to comment because I will be honest and diplomatic, a little "if you can't say anything nice ..." Here goes.
How many of us have used Wine Spectator, Consumer Reports, Car & Driver, Zagats and other publications to help us decide something? They need subscribers/marketing/results to survive. Is any one review the end-all? We look at the many suggestions and then make up our own minds. A competition doesn't MAKE you buy the "best wine"; it is simply a suggestion. If I am looking to try something different, or a wine to serve someone whose tastes are different than mine, I may use these results to assist me. Some competitions avoid naming a "Best in Show," though, for exactly the reasons expressed in earlier posts. It is in our nature to try to pick "a best," though, and it does seem to garner more attention. Most competitions are done blind to remove bias (producer, screwcap, etc.). It shouldn't be turned into an "American Idol" where the most beautiful bottle, label or winemaker affects our decision. Not everyone agrees with the final pick on "Idol" either.
I have participated in many wine competitions over the past 15 years and I have NEVER seen any "Pay for Medal" or any behind the scenes shenanigans. Lenn said "... did I mention how much I have disliked the judging gigs I've done? It's exhausting, mouth-raving work and no where near as educational as sitting down with a wine (or even 15 to Evan's point) and spending several days with them." This would explain his opinion about competitions. There are many ways to run competitions (scoring systems, panels of 3 or 4, no panel discussion, only Riesling, 2 days, etc.), but I have never tried "several hundred wines" in one day. It is work though and requires great concentration. Efforts are made to avoid palate fatigue and may involve most of these - breaks, lunch, bread, neutral olives, rare beef. Using good glasses cuts down on nose fatigue too.
One opinion I read both this year and last is about how to improve a competition's experience/results. It was to make it more educational. A competition is not the place for education with lingering discussion, mini-meals or trips, due to time constraints and its original mission. Expenses are an issue for a majority of wine competitions. Discussion will occur among panel members if there is disparity in the scores. Then we talk about what we may have missed the first time around. I have judged at many competitions over the years at the amateur, local, state, national and international level. I have judged with people from all over the world who are restaurant, retail and wholesale buyers, Masters of Wine, Master Sommeliers, winemakers, wine educators, wine writers, authors of wine books, and some who actually run top wine competitions. As judges we are supposed to come in knowing how the competition is being run, the scoring system being used, and how to evaluate good vs. flawed wines. We are to put our own personal preferences aside too. Judges are human, and humans are looking at the results. There is no machine yet to run wine through for an exact score. Come to one of my classes and listen to what the students are loving - NYS wines like concords, fruits, meads and Rieslings, Arbor Mist, pricey Bordeaux, [yellow tail], malbec and more. Diversity. Who is anyone to tell someone else that they are wrong for liking something? The competitions are not telling anyone that they are wrong if they don't agree. But I feel confident that a competition's Double Gold wine is a better made and better representation of its type than a wine that receives no medal.
Don't blame competitions for a tasting room employee's mistake or bad training in telling you a wine won a gold, not a bronze medal. Many people have stated that they are more influenced by what the employee says about the wine and/or the winery's reputation than the medal status.
Competition results can vary year to year, especially in a cool-climate area like NYS. Wine is ever changing, but Mother Nature plays a big part in how those grapes come into the winery. NYS also has many regions spread out over many miles, so the Long Island AVA may be affected by factors that don't affect the Lake Erie AVA. This can also influence the end results in competitions in years where white wines do better than reds.
There is also a lot of discussion about dry vs. sweet and different families of grapes. I have been exposed at these competitions to so many wines using grapes I have never seen before, much less planted in NYS. Aside from the thrill of encountering a new wine, we receive information about the grape so we know what to look for. As judges we will give the higher scores to a well-made wine that is a good representation of that grape, blend, etc. If you buy a bottle of Chianti, you expect it to look, smell and taste a certain way.
I use this example when I do tasting room training at the New York Wine & Culinary Center. When someone comes in looking for a wine from this state that tastes like their favorite Australian Shiraz, don't apologize that we can't produce that style. We will never be able to produce it given our growing conditions (global warming discussion?). This doesn't mean that NYS is a lousy agricultural producer if we can grow apples, but not pineapples.
If you look at competitions in Alsace, Australia, England and even California, you see native, hybrid, dry and sweet European grape wines (Riesling for example) winning. I judged recently at a competition where "the best" wine was a NYS Diamond. It was excellent and there wasn't any way it could have been better - balance, flavor, finish, etc. Was it the best of all the wines? There were many others that were great, but the majority of judges picked this wine. However, were any of the judges rushing out to have that wine that night or to put in their cellar? Maybe not consumed on a daily basis as often as some other varieties, but I would certainly serve it with some spicy Thai later. Or serve it to someone that enjoys sweet whites?
I rarely know and don't care when judging (unless it is within a price category) how much that particular wine costs. When I see the results, that is when I want to know. Pricing/quality are individual choices.
Finally, as far as transparency, you can go to most competition websites and get as much information as you need - how to enter, what categories, entry forms and prices, deadlines, etc. You can often find the list of judges and their bios. Results are usually left up for years. There seem to be some direct hits on the NYW&GF and the New York Wine & Food Classic. I can honestly say my career would be different without their assistance. They are always there with answers. I provide their website link to my students for info about tours, industry updates and more. They help coordinate so many important industry events that people don't hear about (in D.C. and Albany, viticulture programs, writers from England, industry leaders from Europe, trade shows in London, assisting wineries with getting their wines entered in other competitions and so much more). How are NYS wines gaining international attention? The NYWGF is a large part of the recent exposure and success. I have participated in wine competitions as a backroom volunteer and as a judge. When I see how the NYSW&FC is run, one word comes to mind - Professional. They are doing all of this in spite of drastic cuts to their budget over the last two years.
I love wine. And I love the people I meet through a shared love. I am sure that this controversial topic will bring more people to your report. I just hope that your readers stay open to all that is out there - different grapes, new regions (China is coming!), scoring systems - because it is a great time to be a consumer. I plan on celebrating the great results of all the wine competitions at home, in class and other places (including one for the NYW&FC results to be held at Sheldrake Point next week) for years to come.

You covered a lot of ground, but one thing that stands out is your being convinced that a wine winning double gold is better than one not winning a medal at all.

What happens if at the next competition, the medals are reversed? That can (and no doubt does) happen, which is part of the problem.

Also, I'm not blaming a competition for a tasting room employee's mistake. That's just another example of how these medals can and do do consumers a disservice (in that case, the tasting room employee did as well).

It seems that we have several judges chiming in here, mostly defending competitions and their roles. We aren't attacking you personally or saying you're not doing your jobs the best way you know how. Other people are talking about training for judges in this thread, but we aren't. And it only makes sense that you'd defend these competitions -- you're part of the circuit.

What is important to remember here is that our stance really comes from the consumer perspective and so far, every non-industry person commenting here has said that we're onto something here.

This would be a much more meaningful and useful discussion if everyone would accept that at least some consumers think we're right and maybe we should work to improve the process rather than defend it and remain a part of the machine.

I was hoping my comments would be viewed as insight and explanations more than defense. There are many people who have never been to a wine competition. That's why I was hoping I could provide clarity and transparency to those who aren't in the industry. After every competition, the organizers ask for feedback on how to improve, so we can certainly agree that there can be improvement. If you have evidence of Double Golds being switched for No Medal wines, then let me know. I just haven't seen it or it was corrected before results were released. I don't want to throw the baby out with the bath water for "what ifs..." though. As far as being "part of the circuit" or "part of the machine", those sound pretty negative. I am honored to be asked to judge and feel that working hard over the years for my certifications has paid off. We aren't paid big bucks, but do these competitions for non-profit or charity groups. The last I checked, I am a consumer too.

Lorraine: Obviously I only touched upon a couple of your points, not your entire comment.

Yes, you're a consumer, but you're also a part of the wine industry. You're not the "average consumer" that so many talk about.

I don't mean "circuit" or "machine" in a negative way, so I hope they aren't taken that way.

You agree that there is room for improvement. Can you share some of the things you've asked to have improved? Have they been implemented the following year?

It would be great to get into some specifics.

I don't want to cite the winemaker publicly, but I once heard from a winemaker who makes wines not only for his employer but also for other wineries at the same facility. He once told me about two wines -- one from 'his' winery and one from a client winery -- that were actually the same exact wine inside different bottles/labels.

One wine won a Gold. The other didn't win anything.

It's easy to see how this could happen at a competition with various judges tasting different wines, but I think it's pretty illustrative.

I think the example you cite, where the same wine (under two different labels) received an award in one instance and didn't receive an award in the other, is an obvious example of where a competition could be improved, and suggests one of the criteria you might apply when deciding if a competition is "good" or not.

All of the Australian and NZ wine comps I know anything about assign one panel of judges to each class, so that a consistent group of people rates each set of wines. That said, in any blind tasting there are sometimes results like this (and if the labels are different, an open-label tasting could conceivably yield different ratings for the same wine as well).

I found this read to be exceptionally thoughtful and thorough considering the subject matter. Most either agree or disagree with these competitions as a wholesale opinion, and I appreciate that the NYCR saw it valuable to go into great detail. And even more important, that everyone has engaged in this dialogue openly.

I agree, at least partially, with most of these points. But I want to bring in a slightly different point not often considered in this discussion - that of amateur wine competitions, and how that context can be effectively applied to commercial competitions.

The American Wine Society has a judging program (and awards competition) that is based on "comparative" judging. Yes, wines are judged blindly. But the wines are grouped by variety, location, style, etc. This is a simple but key difference from what NYCR takes issue with above. This way, the factors that become global above are isolated, and wines are compared to their "peers."

I believe that both the amateur and commercial competitions follow the same guidelines, but having only participated as an amateur, I hope I'm correct about this. (Since it is the key link to the NYCR view above.) In the amateur competition, the goal is to score each wine (and award medals) based on how well the wine showcases the styles and expectations of that kind of wine. In other words, a California Cabernet and a Bordeaux would not be judged against each other, but in their own categories.

That alone is important, but to go further, the goal of the amateur competition is to provide guidelines for the producer to improve their product. While I was certainly disappointed to score one point shy of a bronze medal with my first submission, the scoring highlighted, very effectively, where my wine fell short of its "peers". So now, I can attempt to improve upon that effort in the future.

I totally agree that a wine should never be considered better/worse/otherwise by a consumer based on medals awarded. But, if used properly, the process can really be an exceptional tool for the producers.

And I'd personally hate to see judging disappear since my wife does it and we both really enjoy the tasting opportunities that they present!

While that example of the same wine under two labels is enticing, it really doesn't say anything unless you know that the wine was produced and processed exactly the same way, right down to the date and the same bottling. Anything can happen to cause a wide disparity of one wine treated in different ways.

Plus, you must know whether the wines appeared in the same flight (they might have, as they would be the same category) or whether they appeared at two separate panels of judges (which could indict one and not the other panel).

A better test is when the same wine, from the same lot is placed into the lineup twice at the same panel of judges and the results aren't close.

Joe,

The Aussie/NZ setup of panels that you mention seems like a good idea. I have never judged in a competition that did it that way.

I don't know of any wine competition that does not taste blind, that is, the wines are served in numbered glasses--judges never see bottles.

If there is a competition that is done from the bottles, well, good luck believing the results ;)

The last sentence in my comment above came across incorrectly. More accurately, as my wife trains for the AWS judging program, we are both enjoying the opportunity to taste lots of different wines we might not otherwise have tried on our own.

A quick addition to how a wine could be judged differently ... Thomas makes a great point. Same batch judged by the same panel would be ideal. I have done tastings of the same wine served at different temperatures, under different closures, varied storage conditions and other variables that can drastically influence how a wine tastes and scores.
We are usually instructed to judge the wines on how they taste in the glass today, not how they might taste in the future. Some might view this as unfair to blockbuster wines that could improve, but a consumer doesn't see "this wine will taste like a DG in five years" next to the medal. That is just another challenge in being a good judge.
As Joe mentioned, one panel trying all the wines within one category has been done in Aus/NZ and it has also been applied to some domestic competitions. I have also seen at larger competitions, "expert" panels assigned to name the "Best of" wines after other panels have advanced DG/Golds to the next level. There is simply not enough wine and time to have everyone in the room try them.
Jared talks about amateur and commercial judging standards being similar, and they are. I am going into my 3rd year of the American Wine Society's Judge Training Program. Like many competition judges, I am not certified, but the AWS program offers great instruction on how to be a better judge with their system at their own competition. Being a certified wine judge isn't the only qualification for being a good judge, and there are few places to become certified. I chose to become involved exactly for what many of you mention - examining how to improve a process. Not every competition works the same way, for a variety of reasons, and I am glad that I can provide some insight after experiencing many of them. Some even allow guest judges to sit in to see how the process works. Transparency.
I know I would rather view the glass as half full than half empty. I am sure Evan can confirm that about me (next round is on you, Buddy). I hope, since many of the judges who have commented here have attended different competitions over many years, that our experiences can be viewed as truthful and not part of a conspiracy to protect what already exists. I don't know of any competition that wouldn't listen to recommendations, but be open to the fact that there may be reasons the suggestions can't be implemented. We can certainly agree to disagree on some opinions, but facts need to be examined as well.

Unfortunately, this post (and others) got sidetracked by a glitch in software, and I know other people were ready to post more comments.

Before the glitch blocked me out of the comments I was going to comment that I wonder if anyone noticed how many years of training Lorraine Hems has taken on in order to feel that she can be a competent, knowledgeable wine evaluator.

I don't care how much wine we drink, we don't know wine until we study it in depth, from all its angles, and, sorry to say, that includes the technical aspects as well.

This is a very interesting conversation, and I guess I have got to weigh in, and I hope I don't burn bridges with all the writers and reviewers out there. But you know me, lightning rod Scott!
After 20 years of entering wines into competitions I do know one thing for sure: a wine that is not worthy will not win a gold medal no matter how many competitions you enter it in. A wine that consistently wins Silvers will probably win gold, but a wine that wins only Bronze or no award will never make it to the top. So I think it irresponsible for owners to make such an offhand comment suggesting "any wine will eventually win a gold medal" knowing it not to be true.
As a winery owner I view competitions as a way for me to get a good idea of the quality of my wines. As a small winery I don't have the time or the money to have focus groups of consumers evaluating my wines. So I use competitions to judge whether or not my wines are of commercial quality, and also as an indicator of how good they are. In my opinion the main problem with most wineries is the fact that they have a house palate and don't really know if their wines are commercially acceptable. I keep hearing from winery owners that they never win any medals; they think their wines are great, but in reality they are not. If it doesn't win a medal there is a problem, and they should look at what they are doing in the cellar or vineyard to improve their practices so they can win a medal. If it wins a gold and a bunch of silvers and a bronze, I know I have a good wine and we have done the right thing in the vineyard and in the cellar. If one of my wines ever wins only bronze medals and a no-award, I know we have a problem and need to go back and look to see what happened and why it is getting such low scores. There are some very good competitions out there where the majority of judges are well trained and know what they are doing. I don't think judges should be just any Doctor Tom, Dick, or Jane who collects Bordeaux and Burgundies. They should have some wine skill set - as a winemaker, wine seller, or sommelier - or actual accreditation as a wine judge. As producers we have to understand our consumers, make the wines they want, then make wines we want to drink, and above all make them well. The one thing competitions do is set a benchmark for quality.

Now let's not idolize the Europeans. They have just as many competitions as we do and use a wide range of different occupations as judges, just as we do. So no competition has all perfectly trained judges in winemaking, sales, and reviewing, but most have a sprinkling of all. Competitions are the same all over the world; some are better than others in their selection of judges, and it is up to the producer to choose which ones they want to participate in.

Now my beef with reviewers is how many of them think that just because they can write, they can pass judgment on a particular wine. These reviewers never say up front that the only experience they have is that they drink a lot of wine and went to France or California wine tasting, so now they think they can review wine. Well, sure they can review a wine, but are they being accurate? How many wines does a reviewer need to taste before he or she actually knows what they are talking about? How many winemakers should they speak with, and how many winemaking seminars should they attend, before they are qualified to review wines? How many wines and how many different foods do they need to experience before they are qualified to review a wine? Right now, none! How many reviewers really know by tasting whether the wine was in French or American oak barrels, or on oak chips? But they will make that observation even though they have no idea. How many reviewers actually call the winemaker up and ask how the wine was made? Good journalism requires that a writer document the facts. I know we get very few - and I mean fewer than 5 - writers who call for a clarification on types of oak, fermentation temps, etc. so they, the writers, can be accurate. I mean, how many writers - and now, with the use of the internet, bloggers - really know how an oak barrel is made, what actually happens to the wine in the barrel, and why you put wine in oak barrels? Or how many reviewers really understand that winemakers in warm climates add acid and in cool climates add sugar? Or that cool-climate wines are going to be different than warm-climate wines? Is it important? Yes it is! How many reviewers actually know what Pinot Noir tastes like? How many reviewers say lower tonnage is better than higher tonnage? Who has determined that? If they did their research they would know that it is all relative. Low tonnage could be 7 tons an acre if the soil and vine spacing warranted it.
But many reviewers say if you have more then two tons per acre you are over cropping. They are wrong and they are spreading a misconception to the consumer that is wrong. How many reviewers actually know when a wine is corked? Or slightly corked? How many reviewers actually have taken a tasting and aroma class so they can identify what is going on in the wine? So they know what flavors are caused by fermentation and what are not.
There are a number of levels of wine education and diplomas available to reviewers, but most of them don’t pursue any. What I feel is wrong is that someone with so little experience and nothing to lose can pass judgment on a wine or winery without thought or care for the consequences a bad review might have on the winery, the family of the owner, the winery’s employees, and the winery’s business. I think that before reviewers can begin to pass judgment, they need a minimum set of wine credentials, whether that is a Master of Wine, a minimum level-two certification from the Wine and Spirit Education Trust, or a Master Sommelier diploma. I think the wine writer/reviewer/blogger community also needs to get its house in order before it starts passing judgment on competitions and wines. Maybe now is the time for the wine industry and the writers to come up with minimum certifications and qualifications a reviewer must have before writing reviews.

Scott: Thank you for taking the time -- as harvest is underway -- to comment and share your thoughts.

Obviously we don't necessarily agree with you, but what fun would wine (and life) be if we all agreed on everything?

I find it interesting that you use competitions to judge if your wines are any good. I know that you (and your winemaking team) taste a lot of wines from the world over, so I'd think house palate isn't a concern and that you could decide for yourselves if a wine is good or not.

As much as I agree with much of what you say about reviewers/bloggers, I absolutely do not agree that reviewers need any sort of certification or diploma to review wines.

I can only speak for myself of course, but I always ask for information about how a wine is made before I review it and I often email or talk to the winemaker before publishing a review. I can recognize cork taint. I can definitely tell the difference between American, French and even Hungarian oak much of the time. I'm well aware of the additions that can be made in cool and warm climates. Acid is added to wines here on Long Island too, btw.

And despite my lack of formal accreditation, I've seen at least a couple of my reviews displayed in your tasting room, by the way.

Your wines are going to be reviewed by people with a variety of experience -- on blogs, in magazines and on sites like Snooth and CellarTracker. Those are your customers too, remember, not just professionals.

Thanks for sharing your thoughts, and let's all agree on this: No one is burning any bridges, and we all benefit from a respectful discussion. You've always been open, and welcoming, and helpful to writers like Lenn and myself. If you and I happen to disagree on certain things, that only makes things more interesting. But I will never lose my respect for the way you run your business and the integrity you have, and I hope that as long as Lenn and I and the NYCR staff continue to work hard and in a fair and thorough manner, you'll be happy to agree to disagree with us on occasion.

I couldn't agree more when you say that if a wine doesn't win silver or gold, and is consistently getting bronze or no medal, there is almost certainly something very wrong.

And of all the comments dissenting from our position, this was one of the most productive. I think it's healthy to have a discussion about what standards we should have for judges in wine competitions. What do others think? What should the standards be?

As for reviewing wine, or writing about wine, that's entirely different from judging in competitions, and I share Lenn's views on that.

I feel that for many wineries there is room for feedback both from wine competitions and from bloggers, reviewers, etc. I will continue to send wine to the NYCR, to wine comps, and to national print media. All of them have a place in the marketing of our products, and individually each of these reviews can be helpful. We try to evaluate our wine ourselves through blind tasting, within the context of the Finger Lakes as well as other viticultural areas. I always get excited if someone enjoys a wine, but I try not to take any of these results overly seriously. For myself, I trust my own palate to tell me if a wine is very good. Unfortunately, many wine buyers prefer the unbiased opinion of a third party. In any event, we will continue shipping wine all over the country. Evan and Lenn, I enjoy the dialogue you guys opened up. Anybody who has watched Evan on the news knows that he is not afraid to stir the drink, so to speak. That is a compliment, Evan.

It has always struck me that everyone seems to think the consumer needs a medal to buy a wine. When was the last time you bought catsup, yogurt, relish, peanut butter or, frankly, any other consumable product because it won a medal? I think the consumer is smart enough to make their own purchasing decisions, and if you produce a gold-medal Chardonnay, it doesn't mean the consumer will like it. It all comes down to individual tastes, and everyone's taste is different.

What you say is true. I often wonder why so many people think they have an automatic right to tell the rest of us what is good or bad about wine, when few do it with other food products. That's why I believe that a review and/or a competition (a word I dislike as a description of the event) should be closely tied to the merits of the wine rather than the merits of an opinion.

Believe it or not, people can be, and are, trained to put aside bias and use their sensory equipment plus their knowledge to evaluate a wine on its individual merit and quality. Untrained people simply have to work harder to figure out how to separate bias from evaluation, and many fail at it.

The overarching concept behind a wine "competition" is that it is a place where judges put aside their biases and judge a wine by its performance on that day alone. Some of us often have to judge wines that we would never drink, but we can do it, and fairly, because of our training.

As someone who has judged competitions for many years and run a regional competition for a decade, I'm both amused and mystified by this article.
In my view, it starts with an incorrect premise and then goes wildly off track from there. Who decided that the purpose of wine competitions was to deliver valuable information to consumers? Other than Tasters Guild, I cannot think of any.
Rather, competitions have always been trade shows. They emerged from the old model of the county and state fair systems in which agricultural products - be it pork, beef, wine, corn or whatever - were to be judged, not for the benefit of the "consumer," but for those who made the product. A jury of their peers determined the best product entered in that fair.
Fast forward to 2010 and the mission is not that different, though the technology has improved and the refinements in both winemaking and wine judging have changed over the years. But when we judge, we're certainly not sending a message to consumers; we're telling wineries and winemakers they did a good or a bad job. Those judgments are based on technical flaws or successes in the winemaking, varietal correctness, and the unique characteristics that can be expected in a wine from the local soil and climate.
So I think your premise that the "consumer" is, or should be, the ultimate party served by wine competitions is very much off the mark, and shows a deficit of understanding of why and how these competitions were created and continue to exist.
You certainly have a right to disagree with those “principles” and to call for change. But you cannot simply apply what you think they ought to be to what is, which is what you have done.
Sorry, but this is naïve Don Quixote and Sancho Panza wine writing, a bit of vinous windmill tilting. You have seen the monster, and you slew it. Congratulations. Is anybody looking?
Another point: the very scenarios you describe at a wine competition are not even close to any that I have witnessed over many years of judging. Most good competitions (and not all are good) struggle year to year to achieve a balance of judges on each panel. It's an ongoing challenge, and we adjust constantly in an effort to get better, fairer and more balanced judging, just like Fox News!
Every judge with whom I have worked over the years has a basic knowledge of at what point Brettanomyces, one of the elements brought up in your article, is acceptable in a wine and when it flips over the edge and becomes a flaw. They also understand ethyl acetate, volatile acidity, esters, terpenes, chaptalization, pH levels and Brix levels at harvest, and they can spot a stuck fermentation (oh yes, you'd be surprised how often those show up in competition!). They can spot lesser-known or less-understood factors in a wine, such as rotundone or filter-pad leaching, or oak chipping versus barrel aging, and they know how and when acidity is either a merit or a hindrance in a wine.
But whatever the case, the point is that we ask many skills and abilities of judges in competitions, and for the most part, they deliver.
It is a mistake also to suggest that advanced wine degrees make better judges. Experience and palate count for a lot more than parchment. I know MWs who have passed the exam, but do not make good judges. I know others who are great judges. I know people with no formal training who are both great winemakers and terrific judges. I know some judges who are superb at Riesling but whom I would not trust with sparkling wine or Chardonnay. It's all a very carefully orchestrated process.
Ultimately, the question comes down to this: You don't think wine competitions are a good way of measuring wine. So what do you suggest as a better way than blind tasting?
Unless you can answer that, I don't see how your criticism amounts to much. If damning all competitions is how you improve something, then we won't miss you much anyway.
Christopher Cook
Wine and Food Writer, HOUR Detroit
Superintendent, Michigan Wine + Spirits Competition

Still, I have to say that in the past few competitions I have judged, both in and out of New York State, the bent leaned decidedly toward promotion rather than merit, especially when we are gently chided for not awarding enough Golds (not enough of them and wineries will stop submitting, etc., etc.).

It's always been my position that promotion was up to the wineries, but that doesn't seem to be what's going on, as many competitions now have press releases going out to the media.

In any case, what do we expect from a culture of hucksters, to which so-called social media has become just another limb?

Respectfully, I disagree with almost everything you write. I appreciate the feedback, and I'll be specific.

You write: "Competitions have always been trade shows. They emerged from the old model of the county and state fair systems in which agricultural products - be it pork, beef, wine, corn or whatever - were to be judged, not for the benefit of the "consumer," but for those who made the product. A jury of their peers determined the best product entered in that fair. "

My take: In today's marketing world, this is one heck of a myopic view that fails to take into account the result for consumers. If you plainly don't care about consumers and how they are misled, that's fine. That's up to you. But even if what you say is correct - that wine competitions are only trade shows not aimed at assisting consumers - it is still fair to analyze the results that follow. And what follows wine competition results? A barrage of marketing to consumers who are utterly clueless about the meaning, nature, and quality of competitions.

We're writers who seek clarity and transparency for consumers. You run a competition. Our jobs are different. But I'll go further and disagree with your point that these are simply trade shows not aimed at educating consumers, because I've heard it over and over again! Look at the way competitions describe the medal winners. They instantly spin the winners into the titans of their field, leveraging it for marketing. I'm baffled as to how you can miss this.

You write: "You have seen the monster, and you slew it. Congratulations. Is anybody looking?"

My take: Your snarky, dismissive tone doesn't serve you well, because the answer is yes, a whole hell of a lot of people are looking. The response to our position has been massive and overwhelming. Even more telling is the private correspondence from wine judges who tell us that they agree wholeheartedly but cannot say so publicly. Dismiss them if you will; again, that's up to you. And then there are the wineries and consumers who have cheered our position. You don't have to listen to them, but we're heartened by their support.

You write: "Every judge with whom I have worked over the years has a basic knowledge of at what point Brettanomyces - one of the elements brought up in your article - is acceptable in a wine and when it flips over the edge and becomes a flaw."

My take: This is a bizarre statement, considering just how little agreement there is about whether Brett is desirable at all. If you've never met a single judge who thinks Brett is a fatal flaw with no acceptable level, well, I can only say I'm shocked. Brett tolerance varies wildly from person to person. I would generally say that most tasters don't mind some Brett, though of course Brett takes on many manifestations. But I don't find it possible to say there is some baseline where Brett goes from "acceptable," or even "pleasant," to "unacceptable." I confess, though, that I don't run wine competitions, and I'd be happy to be enlightened with more detail on this point.

You write: "It is a mistake also to suggest that advanced wine degrees make better judges. Experience and palate count for a lot more than parchment. I know MWs who have passed the exam, but do not make good judges. I know others who are great judges. I know people with no formal training who are both great winemakers and terrific judges. I know some judges who are superb at Riesling but whom I would not trust with sparkling wine or Chardonnay. It's all a very carefully orchestrated process."

My take: I think these are important points, and I'm glad you bring them up. One of the most common replies we've had in the past week is that many people agree, but they think it's possible to improve competition standards. Many people have suggested creating uniform judging requirements to weed out bad judges. Your point above, however, indicates that this might be a mistake. I appreciate you bringing this up, because I think this is a vital part of the discussion going forward.

If these competitions are essentially just trade shows, not aimed at consumers, then why do they always end with a stentorian press release? Isn't the goal of a press release to inform the media so the media will inform consumers?

And further, if the purpose is only to evaluate the wines on their technical merits so as to provide recognition to wineries from the industry, why award Best In Class or Best of Show? Why not just award double gold or gold or silver or bronze? Aren't the Best Of awards similarly designed to increase sales? It would seem to me that a gold medal is a gold medal, and if the goal were simply to provide feedback to wineries, that would be plenty sufficient.

With this, you have lurched uncontrollably into something dear to my heart.

"And further, if the purpose is only to evaluate the wines on their technical merits so as to provide recognition to wineries from the industry, why award Best In Class or Best of Show? Why not just award double gold or gold or silver or bronze?"

It is devilishly difficult to score better than a Double Gold, since that requires unanimous Gold from judges. So how can any wine be any better than that?

The answer: the Best of categories are about preference and they are for promotional purposes, because along with hucksterism, America just loves a celebrity.

Re: Brett, I think I brought this up a long time ago, but it has receded from memory here. The problem behind Brett is not how little or how much there is; that, again, is a matter of preference at the moment.

If a wine shows Brett, then the winery may be infected.

If the wine shows Brett, and the pH is high, don't place bets on its future, whether or not you prefer that level of Brett.

Tom,
But why do we as the wine industry put so much hype around this product? So much so that people are afraid to buy it. "Oh, which wine to buy, red or white, dry or sweet? What serving temp? What shaped glass? What are we eating? Oh, let's just get a six-pack of beer." On my trips through Europe, wine was wine was wine. It wasn't held up with the "you don't know enough about this to appreciate it" attitude that seems pervasive in the US industry, and yet per capita consumption is much higher there than here. Go figure.

I suppose it might have something to do with a certain lack of, what, depth of culture--I don't know.

One thing that makes me wonder is the geeky concept of wine as a hobby. I can think of many things that make good hobbies, but drinking and collecting wine isn't even on the list (making it is).

Last night at a wine event one of the local winemakers and I were talking about which metals are left after Gold and Double Gold lose their power. He told me that one competition is now at Platinum. I suggested Selenium, and then we went off on a list of metals--it's nearly endless. The opposite of metal fatigue is metal inflation.

Tonight, I have a chicken and pork on the smoker. We will pair it with Heron Hill Riesling (my wife works at their tasting room). Who cares what anyone else thinks about the pairing but my wife and I? And I am sure it will merit a medal of some sort, but I'm fresh out of metal.

Tom,
We stopped entering wine competitions because, after 300 medals, what's the gain? We tell folks that they are the ultimate connoisseur, because they are the ones buying the wine and drinking it. I was quoted once, before we had entered our first competition in '92, as saying, "Our medals don't hang on the wall; they go to the bank." There is a reason why wineries grow: they produce wines their customers buy. I have yet to see wine judges supporting the sales of a winery.

In your manifesto announcing, in the first paragraph, that “we are done judging big, blind, medal-focused wine competitions” you declare that transparency is a central plank in the New York Cork Report’s platform.

My month-by-month on-and-off tracking of the blog’s content generally finds this ideal at work, though there are lapses that at least Evan, a seasoned media pro, should have noticed. Put that aside.

Your announcement that Cork Report “editors and writers . . . will not accept invitations to judge wines at large-scale, blind-tasting events with the goal to hand out ‘medals’ to ‘winning’ wineries” might seem impressive -- at first. Then, on reflection, it might seem grandiose. Why? Well, however many editors and writers constitute the team, they seem for the most part fairly invisible. The day-by-day Cork Report pretty much boils down to a two-writer operation: Lenn and Evan.

So in the interest of transparency, please tell readers how many editors and writers you have, who by name they are and if each and every one of them has indeed individually taken the nonparticipation pledge. For example, your language would seem to encompass Tom Mansell, at Cornell. Does it? Nondisclosure of this body of basic information, or hedging, might well place a question mark above the transparency claim.

When individuals boldly issue manifestos of intent, the world commonly and
justifiably assumes that a sizeable body of hands-on experience accompanied by extended pro-and-con thought has brought them to their positions. What they say in public is met in private by listeners who may ask themselves: “Why should I think they know what they are talking about? And why should I care?”

A careful reading of the Cork Report policy paper tells us next to nothing about the details of Lenn's and Evan's judging backgrounds. We know only that Evan had a “first wine judging competition” and that he won’t name it. We know nothing about Lenn’s judging experience except, in readers’ comments, that he has done “judging gigs.”

So that readers may evaluate the rootedness in bedrock reality of your policy paper, would you both please disclose your full judging experience, naming each contest and the year you participated? There would be no point in keeping this information off the record: sponsors of such events who see the Cork Report know who you are and whom you might have in mind, and they are big boys and girls who know about publicly taking it on the chin and coming up off the canvas.

If you decline to tell us where you’ve both judged, invoking a privacy standard, you could be seen as sinking your own ship, though you may stand on the bridge and continue to elaborate your new operating principles over the public-address system.

Nothing I’ve said here is intended as snarky. Nothing of what I say here should be construed as raising questions about the substantive matters that your declaration raises; they are manifold, significant and problematic. My aim is simply to learn more about your credentials for raising them. If the credentials are perceived as thin, your declaration may be viewed as publicity-seeking.

Look at it this way: the pro-consumer general principles you advocate can easily, plausibly, persuasively be presented by smart law students at a mock trial. But if I needed a lawyer for a real-world nuts-and-bolts trial, I’d hire only someone who has often been around the block. Have you both been? If so, kindly document that trip.