Why the 'Hot List' isn't very hot anymore

Did you see this golf club review?

Scott Rushing

You've probably already received or read this year's version of Golf Digest's Hot List, their annual look at the top golf club equipment from manufacturers. Over the years, I have anticipated the release of this issue because, like most avid golfers, I never have a hard time justifying new equipment, and what better way to narrow down the list of possible choices than by reading real-life feedback from golfers like me? But the last few years, I've grown a bit more wary of the results, not because of the testers, but because of the process. Here's why.

If you look at the rating scale, each club is rated on a 100-point scale, with 45% of the score going to performance, 30% to innovation, 20% to look/feel/sound, and 5% to demand. So my first piece of constructive criticism is that, as a golfer looking for golf clubs, I don't really care about the innovation element. Well, OK, maybe I can acquiesce to 5% of the rating going to innovation, but mostly I care about how the club performs and how it can help "my" kind of golfer. I would value look, feel, and sound higher than innovation because, at the end of the day, I really don't care how innovative a club is. I would keep using the original Callaway Big Bertha if it got the ball to the hole in fewer strokes. What matters to me is performance. So drop the innovation element, or at least restructure its value, because innovation is clearly nowhere near as important as performance, yet it's weighted only 15 points less.

And how does one measure innovation? I'm reminded of the iPhone 5 commercial Samsung did, where a guy waiting in line says, "Yeah, they moved the headphone jack to the BOTTOM of the phone...ewwwwww." It isn't clear who actually assesses this element of the club. Some of the testing involved a category of scientists with Ph.D.s, and if they were tasked with assessing it, maybe you get some level of value. But ask most golfers and we're probably focusing on adjustability (the sliding weight on the TaylorMade SLDR is cool, but is it innovative?) or where the weights are positioned. And just because someone does something new doesn't make it innovative...or good. Just ask the makers of the Cayman Ball or the Bullet Hollow Point Driver.

And what does demand for the club have to do with how it can help my golf game? Nothing. That's just another 5% of the score going to the big guys with the large displays at your local golf store. Drop it.

If you look at the results, it's clear you might as well stop reading, go to your local golf store, and buy whatever big display catches your fancy; there's a 95% chance it'll be Gold rated. If you look at the list, the number of clubs earning Gold is far too high. If everything gets a Gold rating, then to me the list isn't useful anymore. I can drive over to the store, pick out a club from a top manufacturer, and start swinging. I guess with everything being Gold, it does save me time and the $2 from not having to buy and read that issue.

But seriously, if the results have evolved to where almost every major manufacturer is getting Gold ratings, we need to change the rating process. And it's easy to complain about something, so below I offer my suggestion for fixing the Hot List.

First, ditch - OK, I give in - reduce the Innovation factor to 5%. Then split Performance into categories: Distance (15% - we know how lofts get tweaked now to make clubs longer, so let's not over-value this piece), Workability (30%), Forgiveness (25%), and Playability from Varying Ground Conditions (5%). That last one is meant to test clubs out of sand, hardpan, wet conditions, rough, etc. The remaining 20% goes to Look, Sound, and Feel.

I realize the performance categories now account for 75% of the score, but is that a problem? When I'm looking for a club, what I really care about is how it performs. I would also suggest at least considering not lumping all the testing together. It's not clear how much they segregate the higher-handicap reviews and tests from the scratch golfers, but if I'm looking for a game-improvement iron, I'd like to know it was tested by people with my handicap. And it's probably less useful for a 25-handicapper to rate workability when what they need out of a club is forgiveness more than the ability to work the ball.

I think the Hot List had a lot more benefit in the beginning than it does today. Whether it's pressure from the big manufacturers (and advertisers) to always hand out Gold ratings or not, it seems like it's gotten harder NOT to get a Gold rating. And that's the exact opposite of how it should be. I would expect most clubs - or many, at least - to be Silver, with only the best earning Gold. Over time, that's gotten skewed. So those are my suggestions for how to fix the Hot List and make it hot again.

I have not been very impressed with the so-called HOT list. I have never been impressed with Golf Digest's club test issue at all. The older Golf Magazine club test issue was MUCH better, in my opinion. They had about 25 golfers test the new clubs; the golfers picked their favorites and gave a score to each club they tested. What made the tests valuable was that each tester had his handicap listed, so you could look at his handicap and see which clubs he hit well. This allowed EVERY golfer reading the issue to see what might work for them based on his or her handicap. Today the tests are pretty much useless.
As Scott mentions above, what does "Innovation" have to do with how GOOD a club might be? Look at TaylorMade's SLDR driver as an example. They give it a top rating for innovation, yet the SLDR is NOTHING more than a RE-MAKE of the old Mizuno driver from 10 years ago that had the same movable weight in the sole. Fact is, the SLDR is just a COPY of the Mizuno M600 design. That's NOT innovation, that's copying an old design. Same for the new Callaway driver with the movable sliding weight.
Same goes for LOOK, FEEL and SOUND. This is all a matter of TASTE, and nothing else. What YOU like may NOT be what the TESTER likes, so it has ZERO value to anyone other than the tester, NOT the buyer. Some guys like blondes, some like redheads. That doesn't mean one is better than the other; it's just a matter of Taste, NOT Quality.


Putting is easy if you have the Right Putter.