Golf Digest Hot List

JB

This is the worst one they have ever published. It is so obvious who pays their bills at this point that it is absurd. I mean I understand to an extent why they do the Hot List, but to include half the companies, and then only partial equipment from companies makes no sense. I asked some of the companies if this was on them, and their response was that "GD asked for certain equipment and we send it."

I know a few years back they got "caught" and it came out that it was "more of an ad section" than anything else. I cannot seem to find that info anymore.

Oh well, that is my rant for the day.

And this is just MY OPINION.
 
Did the new GD come in the mail today?
 
Yes it did.
 
Cool, I hope my mailman comes early today.
 
I can never believe magazines that publish features like that... it's similar to when Oprah says she likes something...
 
Oprah's related to Elvis. I learned that today on TV. :tongue-in-cheek-icon:

The Hot List is nice as it puts photos and specs on new equipment all in one place. But it has no credibility for me as a form of buying guide. Rather, if there's something in particular I'm interested in, then it's nice to have the Hot List handy just to look at it; but I don't use it the other way around.
 
I honestly do not think that there is intentional brand pumping. But, unless you provide testers with completely generic looking clubs, personal bias is going to come in. Even then, it's pretty easy to recognize a certain brand's clubs.

More than that, golf clubs are not blenders. They don't work exactly the same for every person. About the only way they could do a completely impartial test is to set up an Iron Byron and a Trackman and provide the data. So why don't they?

It's also annoying that no clubs ever get bad reviews. The spectrum seems to be from "really good" to "superfantastic". I know at demo days I've hit clubs where I thought "That feels like crap." You would think at least one person on the testing team would have the same feeling about a club.

I realize that, when it comes to the major manufacturers, the difference between clubs is razor thin and that quality is usually top-notch. But I want a more in-depth review than "Pros" and "Cons". When it comes to Golf Magazine's "reviews", I have to look at the number of stars because all of the comments and the Pros/Cons look identical to me. And more often than not, the comments sound like they were written by the advertising departments of the club manufacturers.

Put a 7 iron on the Iron Byron and show me a chart with distance and dispersion for each model. Set it up for different types of swings, from the perfect path to inside-out and over-the-top, and then tell me what happens to the ball.

Do the same thing with wedges and tell me how high the ball goes and how far it spins back.

Get a putting machine and tell me how true the ball rolls, and what happens to balls hit more toward the toe and the heel.

THEN give them to the testers (without telling them the results) and get the subjective input.

But comments (which aren't even attributed to specific testers) from a bunch of people who may or may not swing like me don't do me any good.
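
To show what I mean, here's a quick sketch in Python (the club names and shot numbers below are completely made up, just for illustration) of the kind of distance-and-dispersion summary a robot test could spit out for each model:

```python
# Hypothetical summary of Iron Byron / Trackman data per club model.
# All names and numbers are invented; a real test would log dozens of
# swings per model and per swing path.
from statistics import mean, stdev

# Each tuple: (carry distance in yards, offline deviation in yards)
shots = {
    "Brand A 7-iron": [(172, 3), (170, -2), (171, 1), (169, 4), (173, -1)],
    "Brand B 7-iron": [(168, 8), (175, -6), (166, 5), (174, -9), (170, 7)],
}

for club, data in shots.items():
    carries = [c for c, _ in data]
    offline = [o for _, o in data]
    print(f"{club}: avg carry {mean(carries):.1f} yds, "
          f"carry spread {stdev(carries):.1f} yds, "
          f"left/right dispersion {stdev(offline):.1f} yds")
```

Same idea for wedges (launch height and spin-back) and the putting machine (roll distance for center, toe, and heel strikes). The point is the chart, not the code.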

For Pete's sake (who the heck is "Pete" anyway?), didn't Consumer Reports do a scientific-style review of balls a few years ago?

(answer: Yes - http://www.consumerreports.org/cro/cu-press-room/pressroom/archive/2006/05/eng0605gol.htm?resultPageIndex=1&resultIndex=7&searchTerm=golf%20balls )

Shouldn't a major golf-specific magazine be able to do the same thing for clubs, balls, gloves, etc.?
 
Harry,
While I agree that they are not brand-biased, I find it odd that some brands get ALL their clubs in and other brands get one or so. Either include all of each of the big brands' clubs or just one or two from each.
 
Although I don't know the full story (obviously), I do know that part of it has something to do with what clubs are submitted by the manufacturers. But regardless, there should be a complete disclosure (i.e., list or better) of every club tested, and why some are featured (i.e., how they arrived at Gold/Silver awards) and why others did not make the cut. Not a lengthy exposition for each club, but at least a brief explanation of how the cut was determined.

Also, here is the text of that Consumer Reports explanation of how they tested balls:


The tests behind the Ratings

We hired an independent golf laboratory to conduct our launch test. The lab uses a computerized robot programmed to swing a golf club so it hits the ball in the middle of the club face (or the “sweet spot”) on every swing. In our tests, the robot swung a driver and an 8-iron. It’s the same robot that the United States Golf Association’s Test Center and many manufacturers use to test golf balls.

We used two swing speeds for our driver tests: 90 mph, to represent amateur swingers, and 110 mph, to represent the pros’ rate. We tracked each ball’s flight using equipment based on Doppler radar. It recorded ball speed (the speed of the ball off the club face), launch angle (angle the ball launches off the club face), and spin. As the ball landed, we recorded how far the shot deviated off center, the ball’s carry distance (how far it traveled in the air), and its total distance (including how far it rolled on the ground). External variables such as wind and humidity were monitored and factored into the results. We used one box of each model golf ball, so each brand’s model was hit 48 times.

We also enlisted local teaching and retail golf professionals as panelists to help us determine the feel of each ball. We covered any identifying marks, then asked the panelists to putt each rated ball, along with a very hard and very soft ball for reference points, a distance of 10 feet. Panelists putted each ball eight times, then scored it on a scale of 1 (hardest) to 9 (softest).
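
Just to illustrate how simple the number-crunching behind that protocol is, here's a rough sketch (Python, with invented numbers shaped like CR's driver data; a real run would have 48 hits per model):

```python
# Invented data in the shape of the CR driver test: hits at two swing
# speeds per ball model, averaged per speed. Real runs used 48 hits.
from statistics import mean

# Each tuple: (swing speed mph, ball speed mph, carry yds, total yds)
hits = {
    "Ball X": [(90, 132, 218, 240), (90, 131, 216, 238),
               (110, 161, 262, 284), (110, 160, 259, 281)],
    "Ball Y": [(90, 130, 214, 237), (90, 129, 213, 235),
               (110, 159, 258, 280), (110, 158, 256, 278)],
}

for ball, rows in hits.items():
    for speed in (90, 110):
        subset = [r for r in rows if r[0] == speed]
        print(f"{ball} @ {speed} mph: "
              f"ball speed {mean(r[1] for r in subset):.1f} mph, "
              f"carry {mean(r[2] for r in subset):.1f} yds, "
              f"total {mean(r[3] for r in subset):.1f} yds")
```

Nothing there a major golf-specific magazine couldn't manage.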
 
Harry,
Two manufacturers told me just yesterday that GD asked them for specific clubs.
 
Yeah, there ought to be a page that discloses all that stuff. Even if it is because GD only tests the best-selling clubs or whatever, just tell us. No one expects GD to be able to test every single club model sold, but if they don't disclose the methodology, it promotes suspicion.

I know Titleist issued a statement a few years ago because they refused to provide clubs for someone's test (either GD or GM), so they weren't on the lists.
 
Very interesting. Did they say why? Because they didn't want to be part of these controversies?
 
I tried to find a copy of the Press Release, but couldn't.

Essentially, the Press Release said something along the lines that "Titleist believes that club fitting is critical to finding the club that works for a player and we don't believe that sending a set of clubs not fitted to a specific tester will provide a good basis for an accurate review."

Who knows what the real reasons were though.
 
Wow, Harry--tell us what you really think!
 
Not listed this issue. But if you read GD regularly, you know where they stand. Their review of all the GPS units they tested was pathetic.
 
And another nice thread bites the dust.

So sad...so sickening.
 
Sorry for offending (again). I deleted the message.
 
Hot list reviews are very poor

Hi, well I have read the wedge part of the reviews many times already, and I think it's a real waste to put this stuff out as worthwhile info. For example, with the Ping Tour-W, they listed it as a negative that the grooves won't scuff your ball, saying people might not like that. :confused2::clown:
What were your thoughts on the comments in the Callaway X-Forged review? Nothing about rough or trap test info. The Vokey test did not comment on the new grind or how it played, and again nothing about how it plays out of a trap or rough. I have to concede that this is just advertising trying to steer people toward certain products.
 
Jerry,
I agree wholeheartedly, as do many of the manufacturers. It's nice to see the pictures and read some blurbs, but why not just call it a special advertising section?
 
Guess I missed that one. You only offend me when you ignore me! :at-wits-end:

Yipes! You need to send me a PM when you post to me or I might miss it, for example while I'm engaged over in the arcade or something! :laugh:

The wedge section did suck. There are so many cool things they could talk about with wedges, and so many new manufacturers out there as well.
 