
Is it just me, or do the leading OEMs inevitably get Gold in the woods and irons categories? I feel like the value of that label has been severely diluted. Not to mention that Golds are doled out so easily (e.g., the hybrids category).

For ME, the "Hot List" has lost some of its credibility. After so many years it has become obvious that money (i.e., advertising dollars) is a major factor in the whole concept.
Responsible, informative journalism has taken a back seat.
I actually quit reading the propaganda and go see for myself what I like. Either I like it or I don't.

The overall complaint, year in and year out, is that advertising dollars affect the results. Golf Digest then answers that they don't, and that thousands of hours were spent testing and thousands of shots were taken. The problem is, even if you take them at their word that advertising dollars don't affect the outcome, there are never any surprises and you don't really learn anything new about any of the clubs.

Sure, TaylorMade has the reputation of an unlimited marketing budget, and they get GOLD every year. But in reality, they produce good equipment and probably deserve good marks. Still, when the magazine tests 13 brands and gives 9 of them a GOLD medal, it kind of defeats the whole purpose. It also makes what should be a respectable second-place Silver look like clubs that belong in the clearance aisle. It's impossible to distinguish what makes an R1 different from a Callaway Extreme. They'll both get gold medals, maybe 4 1/2 or five stars for everything.

If Golf Digest really does spend the hours testing that they claim, with testers of all abilities and ages and thousands of shots, they do an absolutely horrible job of turning those results into real information that golf fans can use. Instead of 500 hours of testing, we get 4 1/2 stars on performance and a comment like "really takes off, feels like the face is really hot."

You could take a HotList from 2006 and just replace the equipment and pics but not change any of the results or comments and no one would know the difference.

I'd simply like to see a complete and thorough listing of the different products offered by the different OEMs... I think people are tired of the same predictable results, based on the same tired methods... I may be missing something, but I honestly don't put a whole lot of stock in it...

Usually the "demand" factor brings down a product, which is unfortunate because Byron Morgan putters are more than gold worthy.

Very true, and that's one of the biggest issues with this list... how on earth are they supposed to accurately rate demand on clubs that haven't even been released yet? It is absolutely impossible to tell, as what they deem to be "hot" may turn out to be a dud, and clubs that initially fly under the radar can end up being big sellers purely due to superior performance.

Demand and cost shouldn't come into it... it should all be about looks, feel, sound and, most importantly, performance. It should also be totally independent.

Don't remember if it was Golf or Golf Digest, but a few years ago they had Hotstixx test all the clubs, and they gave yardage, accuracy, etc. It lasted I think one year, then it was gone. Both magazines' tests are a joke.

The flip side of the over-reward criticism is that most equipment on the market now is actually pretty good for most recreational golfers who are not as picky as many of us on these boards, so in some way, GD can justify having so many "award winners".

I never understood why people got so up in arms about it, GD is just one publication and one source of reviews on equipment, it was never supposed to be the only source for club reviews and ratings. The best way to find something that works is to try it, not go by what a magazine or internet board says. I guess I've long accepted the fact that most major publications are never really going to have perfectly objective or even useful reviews for that matter. They are all pretty much just bathroom reading these days.

Welp, that just saved me from buying this edition at the newsstand.
Gone are the days of several testers providing feedback on pros and cons. Just a simple list with gold and silver stars.
Also noticed they pushed it back a month this year so Callaway, Nike, and TaylorMade could get their new lines in there in time... didn't TM miss out last year?

The GD Hot List proves once again that there is little (if any) difference in performance among the top brands. Unless people want the lists to be totally subjective, how could GD not give gold ratings to almost everything?

It's pretty funny how "2nd tier" companies like Cleveland and Cobra end up with the majority of Silver badges.

That struck me also. I remember thinking that in the past as well, especially regarding Cleveland. I work in a golf store and get the opportunity to really try out different clubs, often over a long period, and of course I don't like every model that Cleveland produces, but that's more a preference thing. My opinion regarding Cleveland is that they make really classy products!

One other thing: why not comment on the products that didn't make the list? If they think that some model is crap, why don't they tell us? It's just bad consumer journalism. It's not what I call a test. In a sense everybody wins, because every product at least gets a silver medal... We need golf magazines that take a critical angle on the sport and the equipment.

Is it just me, or do the leading OEMs inevitably get Gold in the woods and irons categories? I feel like the value of that label has been severely diluted. Not to mention that Golds are doled out so easily (e.g., the hybrids category).

+1

Seems like if you are an OEM with advertising space in Golf Digest, you get a gold rating. Everything can't be a gold winner. These ratings are diluted and useless.

The flip side of the over-reward criticism is that most equipment on the market now is actually pretty good for most recreational golfers who are not as picky as many of us on these boards, so in some way, GD can justify having so many "award winners".

I never understood why people got so up in arms about it, GD is just one publication and one source of reviews on equipment, it was never supposed to be the only source for club reviews and ratings. The best way to find something that works is to try it, not go by what a magazine or internet board says. I guess I've long accepted the fact that most major publications are never really going to have perfectly objective or even useful reviews for that matter. They are all pretty much just bathroom reading these days.

The thing is that the Hot List is a very powerful voice in the golfing world! The companies use it heavily in their advertising. Shouldn't the list then be more analytical and more critical? I think it's sad the test is so vague: no real explanation of why one club gets Gold and another Silver...