New Golf Digest 2019 Top 100 in USA Rankings are out!


Comments

  • raynorfan1 Members Posts: 3,541 ✭✭

    raynorfan1 wrote:


    I'm not a 5 handicap, but I think it's a fair requirement in trying to build a pool of 2000+ raters. They need to have some proxy of general golf knowledge, and there probably isn't a better one than handicap.



    Courses are also all rated from the back tees, by definition, and how difficult they are is a key metric; so having better golfers as raters is rational.




    I just don't agree. Lots of 10 cappers are more in touch with golf than low cappers. Just because you have the skill to play golf doesn't mean you have the skill to evaluate courses.



    And rating a course is 100% subjective, so why do you want only the opinion of low cappers? If 80% of the golfers are above a five, then why have the minority rate courses for the majority?



    I can rate a course from the back tees just as well as a 5 capper. And why should a course rating be based on tees that 95% of us never play?



    Maybe a better proxy is how many times you play per year.




    There's no doubt that it's an imperfect method. And there are certainly 10 handicaps who are far more "knowledgeable" than some 5 handicaps. However when you're trying to build a big sample, and you don't have the resources to interview all the applicants, it's not a terrible cut-off.



    Interestingly, from a math perspective, about 1 in 100 golfers with a 5-or-better handicap is a Golf Digest "panelist", which is way more than I would have thought (they have ~2,000 panelists, there are 2 million active handicaps, and roughly 10% of people with handicaps are 5 or better).
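
    A quick back-of-the-envelope check of that figure in Python; every input below is just the rough estimate above, not an official number:

    # Rough sanity check of the "1 in 100" panelist figure (all numbers are estimates)
    active_handicaps = 2_000_000   # golfers carrying an active handicap
    share_five_or_better = 0.10    # rough share at a 5 handicap or better
    panelists = 2_000              # approximate Golf Digest panel size

    eligible = active_handicaps * share_five_or_better            # ~200,000 golfers
    print(f"about 1 in {round(eligible / panelists)} eligible golfers is a panelist")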



    You are, of course, welcome to quibble with how Golf Digest has set up the ratings...if nothing else, they're hoping to generate that buzz...
  • Roadking2003 Austin Members Posts: 5,307 ✭✭
    raynorfan1 wrote:


    raynorfan1 wrote:


    I'm not a 5 handicap, but I think it's a fair requirement in trying to build a pool of 2000+ raters. They need to have some proxy of general golf knowledge, and there probably isn't a better one than handicap.



    Courses are also all rated from the back tees, by definition, and how difficult they are is a key metric; so having better golfers as raters is rational.




    I just don't agree. Lots of 10 cappers are more in touch with golf than low cappers. Just because you have the skill to play golf doesn't mean you have the skill to evaluate courses.



    And rating a course is 100% subjective, so why do you want only the opinion of low cappers? If 80% of the golfers are above a five, then why have the minority rate courses for the majority?



    I can rate a course from the back tees just as well as a 5 capper. And why should a course rating be based on tees that 95% of us never play?



    Maybe a better proxy is how many times you play per year.




    There's no doubt that it's an imperfect method. And there are certainly 10 handicaps who are far more "knowledgeable" than some 5 handicaps. However when you're trying to build a big sample, and you don't have the resources to interview all the applicants, it's not a terrible cut-off.



    Interestingly, from a math perspective, about 1 in 100 golfers with a 5-or-better handicap is a Golf Digest "panelist", which is way more than I would have thought (they have ~2,000 panelists, there are 2 million active handicaps, and roughly 10% of people with handicaps are 5 or better).



    You are, of course, welcome to quibble with how Golf Digest has set up the ratings...if nothing else, they're hoping to generate that buzz...




    This is true. It's entertainment. What would we have to debate for hours on end without all of these magazine ratings?
  • Schley Love ya don't tell ya enough! Kingdom of Saudi Arabia Members Posts: 1,155 ✭✭

    raynorfan1 wrote:


    I'm not a 5 handicap, but I think it's a fair requirement in trying to build a pool of 2000+ raters. They need to have some proxy of general golf knowledge, and there probably isn't a better one than handicap.



    Courses are also all rated from the back tees, by definition, and how difficult they are is a key metric; so having better golfers as raters is rational.




    I just don't agree. Lots of 10 cappers are more in touch with golf than low cappers. Just because you have the skill to play golf doesn't mean you have the skill to evaluate courses.



    And rating a course is 100% subjective, so why do you want only the opinion of low cappers? If 80% of the golfers are above a five, then why have the minority rate courses for the majority?



    I can rate a course from the back tees just as well as a 5 capper. And why should a course rating be based on tees that 95% of us never play?



    Maybe a better proxy is how many times you play per year.




    I have never been a 5 and probably never will be, but one of GD's criteria is very simple:



    2. RESISTANCE TO SCORING: How difficult, while still being fair, is the course for a scratch player from the back tees?



    So they are expanding their term "scratch" to include handicaps of 5 or less because they need people to actually go and play these courses to get a critical mass of data. We are talking about the best courses, so I like this particular criterion: if a course is difficult for the best players, it will certainly be difficult for the average player. A great course should have the teeth to provide a difficult test for championship golfers as well as for those who aren't. There are many more great courses for average players, because we don't have to use the 7,000-plus-yard tee boxes, so we can check this box more easily than a scratch player can.



    It is subjective, but I acknowledge that the best courses should be able to provide a test to the best players, AND there are more great courses for average players since a course doesn't need the length for us.
  • az2au Members Posts: 1,800 ✭✭

    az2au wrote:


    az2au wrote:

    elwhippy wrote:


    Interesting reading as per usual but I would love to know how you get on the rating panel as so many of the courses are members and guests only. I imagine Doak and Fazio could design a crazy golf course and get it ranked looking at the list.


    It isn't that hard. You apply.




    There used to be a link on the Golf Digest website explaining how to apply to become a rater. I don't know if it is still up. It was basically an email link. I have submitted requests to begin the application process at least twice over the past few years. No response whatsoever. Based on my experience, it's rather difficult to become a rater. I couldn't even get the process started.


    If you meet the requirements, though, it really isn't that hard now (a verifiable index of 5 or less, and the ability to play and rate at least a dozen clubs per year). They doubled the number of panelists a couple of years ago.




    Interesting. So the list should be titled "Top 100 courses for low handicappers and screw 80% of all golfers".


    If you view it that way, then it is more like 95%, as the handicap has to be verifiable (e.g., tournament rounds or similar), or at least it did when I submitted mine. There has to be some requirement of golf knowledge, as raynorfan1 says. Verified handicaps are about as good a proxy as it gets, and they're not all that great. I know a handful of extremely low/plus-index players who couldn't describe even basic elements of course design, but, eh, they also wouldn't have any interest in being panelists either.



    It is an imperfect system, but that's inherent with subjective ratings to begin with. I think Golfweek has the best system, but the lists turn out awfully similar, so Golf Digest can't be too bad. I had a lot of completely incorrect theories about how this worked before I became a panelist, but the two biggest were that money influenced things and that it would be "fun" to do. I see the same stuff repeated here every other year and I get it. There's no point in arguing it. I will say, however, that Ron and Stephen are great to work with and run a tight ship. This is truly Ron's passion and it shows.
  • Schley Love ya don't tell ya enough! Kingdom of Saudi Arabia Members Posts: 1,155 ✭✭
    A couple neat facts from the GD site on this year's ranking:



    No. 15 Friar's Head (pictured) and No. 46 Ballyneal are the only two courses that have risen in each of the last five rankings.





    Architect Tom Fazio boasts the most designs in the top 100 (13), including No. 26 Shadow Creek. Pete Dye and Donald Ross are next with nine.





    Gil Hanse boasts the most redesigns in the top 200 with 15, including No. 6 Merion.





    The 1920s produced more 100 Greatest Courses than any other decade with 25, including No. 10 Fishers Island. The 2000s were next with 19.
  • raynorfan1 Members Posts: 3,541 ✭✭
    edited Jan 6, 2019 8:05pm #37
    The flaw in the Golf Digest system is that so much of the ranking that is supposed to be data-driven is actually preordained by expectations. If you take Conditioning as an example, once you get to a certain level (and almost all of the Top 200 courses are there), it just doesn't really get better. There are a couple of courses (La Quinta, apparently) that really go above and beyond on conditioning, and a few (Bethpage) that get a ton of play and can't maintain a perfect standard - but everybody else is basically the same. But when you look at the scores? Voila! They match up almost perfectly with the overall ranking of the course.



    Ambience? Same deal. There are some unbelievable experiences at courses all over the list...yet the scores basically reflect the overall rankings almost perfectly. How is it that Bandon Trails can score 7.9546 in "Ambience" while Pacific Dunes scores 8.423? And that lone half point would either move Bandon Trails up about ten spots or Pac Dunes down about ten spots. If you look at Pac Dunes / Bandon Dunes / Old Macdonald / Bandon Trails scores in Conditioning and Ambience, all follow almost exactly the order in which they are "overall" ranked...but all have effectively identical ambience and conditioning on the ground.
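
    If Golf Digest ever published the per-category scores, one way to test this would be to compute a rank correlation between each category and the overall ranking. A minimal Python sketch with made-up scores purely for illustration (none of these numbers come from the actual list):

    # Spearman rank correlation between a category score and the overall score.
    # All course scores below are invented for illustration only.
    def ranks(values):
        # rank 1 = highest value
        order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
        out = [0] * len(values)
        for r, i in enumerate(order, start=1):
            out[i] = r
        return out

    def spearman(xs, ys):
        # classic formula, assuming no tied scores
        rx, ry = ranks(xs), ranks(ys)
        n = len(xs)
        d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
        return 1 - 6 * d2 / (n * (n * n - 1))

    overall  = [8.9, 8.7, 8.5, 8.3]   # hypothetical overall scores for four courses
    ambience = [8.4, 8.3, 8.1, 8.0]   # hypothetical Ambience scores for the same courses

    print(f"rank correlation: {spearman(overall, ambience):.2f}")   # 1.00 here, i.e. Ambience just mirrors the overall order

    A correlation near 1.0 across every category would be consistent with the categories simply echoing a course's overall reputation rather than being judged independently; genuinely independent judgments would show more scatter.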
  • mallrat Members Posts: 2,914 ✭✭
    The fact that Spyglass is above Old Mac which is above Bandon Trails tells me these rankings are doo doo.
  • Roadking2003 Austin Members Posts: 5,307 ✭✭
    raynorfan1 wrote:


    The flaw in the Golf Digest system is that so much of the ranking that is supposed to be data-driven is actually preordained by expectations. If you take Conditioning as an example, once you get to a certain level (and almost all of the Top 200 courses are there), it just doesn't really get better. There are a couple of courses (La Quinta, apparently) that really go above and beyond on conditioning, and a few (Bethpage) that get a ton of play and can't maintain a perfect standard - but everybody else is basically the same. But when you look at the scores? Voila! They match up almost perfectly with the overall ranking of the course.



    Ambience? Same deal. There are some unbelievable experiences at courses all over the list...yet the scores basically reflect the overall rankings almost perfectly. How is it that Bandon Trails can score 7.9546 in "Ambience" while Pacific Dunes scores 8.423? And that lone half point would either move Bandon Trails up about ten spots or Pac Dunes down about ten spots. If you look at Pac Dunes / Bandon Dunes / Old Macdonald / Bandon Trails scores in Conditioning and Ambience, all follow almost exactly the order in which they are "overall" ranked...but all have effectively identical ambience and conditioning on the ground.




    Great observation! Obviously, the raters are influenced by past ratings. It would be interesting to have some of these raters articulate the conditioning rating differences, or any of the differences.
  • Arb8889 Members Posts: 593 ✭✭
    Friars Head will be a staple of the top 10 in the rankings next 20 years. So so good
    TM M3 460 (12) w/ Fujikura Pro XLR8 X
    TM M2 3 Wood HL (16.5) w/ Fujikura Pro XLR8 S
    TM Burner 7 Wood (21) w/ Aldila NVS S

    TM P790 4-5 w/ KBS C Taper Lite S
    TM P770 6-P w/ KBS C Taper Lite S
    TM MG Wedges 52/56/60 w/ KBS C Taper Lite S
    TM Spider Tour Black
    TM TP5 Ball
  • mci711 Chicago Members Posts: 1,024 ✭✭
    edited Jan 11, 2019 12:07pm #41

    raynorfan1 wrote:


    The flaw in the Golf Digest system is that so much of the ranking that is supposed to be data-driven is actually preordained by expectations. If you take Conditioning as an example, once you get to a certain level (and almost all of the Top 200 courses are there), it just doesn't really get better. There are a couple of courses (La Quinta, apparently) that really go above and beyond on conditioning, and a few (Bethpage) that get a ton of play and can't maintain a perfect standard - but everybody else is basically the same. But when you look at the scores? Voila! They match up almost perfectly with the overall ranking of the course.



    Ambience? Same deal. There are some unbelievable experiences at courses all over the list...yet the scores basically reflect the overall rankings almost perfectly. How is it that Bandon Trails can score 7.9546 in "Ambience" while Pacific Dunes scores 8.423? And that lone half point would either move Bandon Trails up about ten spots or Pac Dunes down about ten spots. If you look at Pac Dunes / Bandon Dunes / Old Macdonald / Bandon Trails scores in Conditioning and Ambience, all follow almost exactly the order in which they are "overall" ranked...but all have effectively identical ambience and conditioning on the ground.




    Great observation! Obviously, the raters are influenced by past ratings. It would be interesting to have some of these raters articulate the conditioning rating differences, or any of the differences.




    The list is a great idea for knowing what the best overall courses in the country are. But the actual rankings are bullsh!t. Anyone who believes that a group of raters came in and had no bias/influence on how they ranked is lying to themselves. Pebble is probably never going to fall from the #1 public spot even though there are quite a few better courses. I look at these rankings as just a big list; if anything, tiering golf courses would make more sense. I do believe 1-25 are, for the most part, probably stronger than 26-50, and so on. That is how I look at them, at least.



    I don't have a lot of experience on private courses, a few good ones in Chicago but that is it. But as far as public goes, I have played 15 of the top 25, and when people ask me which are the best it is almost impossible to say. They are all just really good and in that same tier (gun to my head, give me Erin Hills).
  • Roadking2003 Austin Members Posts: 5,307 ✭✭
    Does anyone have a source for the list in list format (not PDF)?
  • raynorfan1 Members Posts: 3,541 ✭✭
    Arb8889 wrote:


    Friars Head will be a staple of the top 10 in the rankings next 20 years. So so good




    Feels unlikely to me, and not because Friar's Head is anything short of spectacular. There are only 10 spots. Who's vulnerable to being replaced by Friar's Head? Safe to say that none of the current Top 5 have any risk of falling out of the Top 10 in the next 20 years.



    Pebble Beach and Merion will ride their championship pedigrees to safety. Which leaves NGLA, Fisher's Island, and Sand Hills. I could pretty easily see FH bumping Fisher's Island, but it's a pretty tough group to knock out.
  • Roadking2003 Austin Members Posts: 5,307 ✭✭
    raynorfan1 wrote:

    Arb8889 wrote:


    Friars Head will be a staple of the top 10 in the rankings next 20 years. So so good




    Feels unlikely to me, and not because Friar's Head is anything short of spectacular. There are only 10 spots. Who's vulnerable to being replaced by Friar's Head? Safe to say that none of the current Top 5 have any risk of falling out of the Top 10 in the next 20 years.



    Pebble Beach and Merion will ride their championship pedigrees to safety. Which leaves NGLA, Fisher's Island, and Sand Hills. I could pretty easily see FH bumping Fisher's Island, but it's a pretty tough group to knock out.




    This is true. Like college football rankings, it's easier to stay at the top if you start at the top because humans do the voting and humans have emotions.



    A top ten course has to really screw up to lose its ranking, and that's not likely to happen.
  • FairwayFred Sponsors Posts: 4,020 ✭✭
    edited Jan 11, 2019 6:00pm #45
    Golf Digest especially sets their rankings up to make it hard for wholesale change. Though their recent changes to criteria seem like an opportunity for a shakeup.



    The reality is there are a lot of immovable objects on these lists.



    I'd suggest that no new course, no matter how good ever really has a chance to reach #1 on any list other than maybe the Golf Week Modern list.
    FREE AGENT CLUB HO NO MO!
    Ari Techner
    National Custom Works nationalcustomworks.com
    [email protected]
    IG: @nationalcustom
    Twitter: @WorksNational
    (still a huge club HO)
  • irvtrain Members Posts: 1,007 ✭✭
    marks21 wrote:

    mci711 wrote:


    I'm surprised; from what I have heard, Mammoth Dunes is better than Sand Valley, according to just about everyone who has played both.





    Do you have a link?




    It's also interesting that GD named Streamsong Black the Best New Course of 2018 over Mammoth Dunes but has MD ranked higher.




    Another panelist here. Best New Course evaluations were due by the end of August, whereas all evaluations for this ranking period were due by the end of September. Both of those courses saw a lot of evaluations, and the scores changed enough to flip them over that month.
  • irvtrain Members Posts: 1,007 ✭✭

    raynorfan1 wrote:


    I'm not a 5 handicap, but I think it's a fair requirement in trying to build a pool of 2000+ raters. They need to have some proxy of general golf knowledge, and there probably isn't a better one than handicap.



    Courses are also all rated from the back tees, by definition, and how difficult they are is a key metric; so having better golfers as raters is rational.




    I just don't agree. Lots of 10 cappers are more in touch with golf than low cappers. Just because you have the skill to play golf doesn't mean you have the skill to evaluate courses.



    And rating a course is 100% subjective, so why do you want only the opinion of low cappers? If 80% of the golfers are above a five, then why have the minority rate courses for the majority?



    I can rate a course from the back tees just as well as a 5 capper. And why should a course rating be based on tees that 95% of us never play?



    Maybe a better proxy is how many times you play per year.




    The system to become a panelist is pretty good. Actually, you can evaluate a course from the tips without playing the tips. You don't necessarily have to be a 5 cap or less if you belong to a club that will give you access to a lot of courses anyway. A big criterion for becoming a panelist is having already played several courses in the top 200, so you know what good courses are. I wouldn't discourage you from applying.
  • irvtrain Members Posts: 1,007 ✭✭
    Schley wrote:


    A couple neat facts from the GD site on this year's ranking:



    No. 15 Friar's Head (pictured) and No. 46 Ballyneal are the only two courses that have risen in each of the last five rankings.





    Architect Tom Fazio boasts the most designs in the top 100 (13), including No. 26 Shadow Creek. Pete Dye and Donald Ross are next with nine.





    Gil Hanse boasts the most redesigns in the top 200 with 15, including No. 6 Merion.





    The 1920s produced more 100 Greatest Courses than any other decade with 25, including No. 10 Fishers Island. The 2000s were next with 19.




    I can only speak to Ballyneal. They've done a lot to the course over the last 5 years, including changing the grass, and they have a new superintendent. The conditions are significantly better, to the point that it plays completely differently than when I played it 4 years ago.



    Fazio's designs seem to have some love among panelists, but I think that has more to do with the amount of money the clubs put into the courses, not only for conditioning but for aesthetics as well. For example, I played a Fazio course that has an unlimited budget and my caddie said their budget just for flowers every year is $1M. The other thing Fazio does is come back to courses often and update the design for the club.
  • irvtrain Members Posts: 1,007 ✭✭

    raynorfan1 wrote:


    The flaw in the Golf Digest system is that so much of the ranking that is supposed to be data-driven is actually preordained by expectations. If you take Conditioning as an example, once you get to a certain level (and almost all of the Top 200 courses are there), it just doesn't really get better. There are a couple of courses (La Quinta, apparently) that really go above and beyond on conditioning, and a few (Bethpage) that get a ton of play and can't maintain a perfect standard - but everybody else is basically the same. But when you look at the scores? Voila! They match up almost perfectly with the overall ranking of the course.



    Ambience? Same deal. There are some unbelievable experiences at courses all over the list...yet the scores basically reflect the overall rankings almost perfectly. How is it that Bandon Trails can score 7.9546 in "Ambience" while Pacific Dunes scores 8.423? And that lone half point would either move Bandon Trails up about ten spots or Pac Dunes down about ten spots. If you look at Pac Dunes / Bandon Dunes / Old Macdonald / Bandon Trails scores in Conditioning and Ambience, all follow almost exactly the order in which they are "overall" ranked...but all have effectively identical ambience and conditioning on the ground.




    Great observation! Obviously, the raters are influenced by past ratings. It would be interesting to have some of these raters articulate the conditioning rating differences, or any of the differences.




    Yes and no. Once you play courses in the top 200, you can see the difference between others not in the top 200. To make it in the top 200 you have to have solid scores in all the categories. There are also courses that don't have enough evaluations to qualify for the top 200 that probably would be in there if they did. At the end of the day, there are a lot of really really good courses out there and in order to differentiate one from the other you are splitting hairs at times.
  • raynorfan1 Members Posts: 3,541 ✭✭
    irvtrain wrote:


    raynorfan1 wrote:


    The flaw in the Golf Digest system is that so much of the ranking that is supposed to be data-driven is actually preordained by expectations. If you take Conditioning as an example, once you get to a certain level (and almost all of the Top 200 courses are there), it just doesn't really get better. There are a couple of courses (La Quinta, apparently) that really go above and beyond on conditioning, and a few (Bethpage) that get a ton of play and can't maintain a perfect standard - but everybody else is basically the same. But when you look at the scores? Voila! They match up almost perfectly with the overall ranking of the course.



    Ambience? Same deal. There are some unbelievable experiences at courses all over the list...yet the scores basically reflect the overall rankings almost perfectly. How is it that Bandon Trails can score 7.9546 in "Ambience" while Pacific Dunes scores 8.423? And that lone half point would either move Bandon Trails up about ten spots or Pac Dunes down about ten spots. If you look at Pac Dunes / Bandon Dunes / Old Macdonald / Bandon Trails scores in Conditioning and Ambience, all follow almost exactly the order in which they are "overall" ranked...but all have effectively identical ambience and conditioning on the ground.




    Great observation! Obviously, the raters are influenced by past ratings. It would be interesting to have some of these raters articulate the conditioning rating differences, or any of the differences.




    Yes and no. Once you play courses in the top 200, you can see the difference between others not in the top 200. To make it in the top 200 you have to have solid scores in all the categories. There are also courses that don't have enough evaluations to qualify for the top 200 that probably would be in there if they did. At the end of the day, there are a lot of really really good courses out there and in order to differentiate one from the other you are splitting hairs at times.




    You’re missing the point. Of course it’s splitting hairs all over the Top 100/200. The curious thing is why the hairs almost always split in the same direction.
  • carrera Members Posts: 2,548 ✭✭
    irvtrain wrote:


    For example, I played a Fazio course that has an unlimited budget and my caddie said their budget just for flowers every year is $1M.




    That's insane, even if remotely true (at most clubs the only members who know what is spent around the club are the board, finance committee, and green committee). The club must look like old Sentry World.
    Cobra F9 Tour Length - Hzrdus Smoke 70 stiff
    Cobra F9 14.5 - Atmos Blue Stiff
    Callaway Epic Hybrid - Recoil 780 stiff
    Cobra King 3/4 Utility - Recoil 780 Smacwrap F4
    Cobra King Forged TEC Black - Recoil 95 F4 5-PW
    Cobra King Black wedges 54/58 versatile grinds - Recoil 110 F4
    Odyssey 2Ball
    TM TP5X
  • irvtrain Members Posts: 1,007 ✭✭
    edited Jan 11, 2019 9:29pm #52
    carrera wrote:

    irvtrain wrote:


    For example, I played a Fazio course that has an unlimited budget and my caddie said their budget just for flowers every year is $1M.




    That's insane, even if remotely true (at most clubs the only members who know what is spent around the club are the board, finance committee, and green committee). The club must look like old Sentry World.




    It is probably an accurate figure. This is a course that isn't in the top 200 but probably has the most billionaires as members. Their dues fluctuate between $6K in the offseason and $10K in season. When I was there, it seemed like there were very few members at the course; it was mostly their kids or grandkids.
  • irvtrain Members Posts: 1,007 ✭✭
    raynorfan1 wrote:

    irvtrain wrote:


    raynorfan1 wrote:


    The flaw in the Golf Digest system is that so much of the ranking that is supposed to be data-driven is actually preordained by expectations. If you take Conditioning as an example, once you get to a certain level (and almost all of the Top 200 courses are there), it just doesn't really get better. There are a couple of courses (La Quinta, apparently) that really go above and beyond on conditioning, and a few (Bethpage) that get a ton of play and can't maintain a perfect standard - but everybody else is basically the same. But when you look at the scores? Voila! They match up almost perfectly with the overall ranking of the course.



    Ambience? Same deal. There are some unbelievable experiences at courses all over the list...yet the scores basically reflect the overall rankings almost perfectly. How is it that Bandon Trails can score 7.9546 in "Ambience" while Pacific Dunes scores 8.423? And that lone half point would either move Bandon Trails up about ten spots or Pac Dunes down about ten spots. If you look at Pac Dunes / Bandon Dunes / Old Macdonald / Bandon Trails scores in Conditioning and Ambience, all follow almost exactly the order in which they are "overall" ranked...but all have effectively identical ambience and conditioning on the ground.




    Great observation! Obviously, the raters are influenced by past ratings. It would be interesting to have some of these raters articulate the conditioning rating differences, or any of the differences.




    Yes and no. Once you play courses in the top 200, you can see the difference between others not in the top 200. To make it in the top 200 you have to have solid scores in all the categories. There are also courses that don't have enough evaluations to qualify for the top 200 that probably would be in there if they did. At the end of the day, there are a lot of really really good courses out there and in order to differentiate one from the other you are splitting hairs at times.




    You're missing the point. Of course it's splitting hairs all over the Top 100/200. The curious thing is why the hairs almost always split in the same direction.




    I see what you're saying. I guess courses need to have a fine wine element to them and need to stand the test of time. Sand Hills being in the top 10 in such a short time shows it is possible to crack it.
  • FairwayFred Sponsors Posts: 4,020 ✭✭
    edited Jan 11, 2019 9:36pm #54
    irvtrain wrote:

    raynorfan1 wrote:

    irvtrain wrote:


    raynorfan1 wrote:


    The flaw in the Golf Digest system is that so much of the ranking that is supposed to be data-driven is actually preordained by expectations. If you take Conditioning as an example, once you get to a certain level (and almost all of the Top 200 courses are there), it just doesn't really get better. There are a couple of courses (La Quinta, apparently) that really go above and beyond on conditioning, and a few (Bethpage) that get a ton of play and can't maintain a perfect standard - but everybody else is basically the same. But when you look at the scores? Voila! They match up almost perfectly with the overall ranking of the course.



    Ambience? Same deal. There are some unbelievable experiences at courses all over the list...yet the scores basically reflect the overall rankings almost perfectly. How is it that Bandon Trails can score 7.9546 in "Ambience" while Pacific Dunes scores 8.423? And that lone half point would either move Bandon Trails up about ten spots or Pac Dunes down about ten spots. If you look at Pac Dunes / Bandon Dunes / Old Macdonald / Bandon Trails scores in Conditioning and Ambience, all follow almost exactly the order in which they are "overall" ranked...but all have effectively identical ambience and conditioning on the ground.




    Great observation! Obviously, the raters are influenced by past ratings. It would be interesting to have some of these raters articulate the conditioning rating differences, or any of the differences.




    Yes and no. Once you play courses in the top 200, you can see the difference between others not in the top 200. To make it in the top 200 you have to have solid scores in all the categories. There are also courses that don't have enough evaluations to qualify for the top 200 that probably would be in there if they did. At the end of the day, there are a lot of really really good courses out there and in order to differentiate one from the other you are splitting hairs at times.




    You're missing the point. Of course it's splitting hairs all over the Top 100/200. The curious thing is why the hairs almost always split in the same direction.




    I see what you're saying. I guess courses need to have a fine wine element to them and need to stand the test of time. Sand Hills being in the top 10 in such a short time shows it is possible to crack it.




    Sand Hills isn't that new anymore. It opened in 1995. Also, it has some cachet as the first real middle-of-nowhere destination club and as something of a precursor to the modern minimalist movement. So while it's modern, it also has a classic element to it. It's also insanely special; I've played nearly the whole Top 100 and it's my #1. It will be very hard for any new course to ever again crack the top 10, imo.
    FREE AGENT CLUB HO NO MO!
    Ari Techner
    National Custom Works nationalcustomworks.com
    [email protected]
    IG: @nationalcustom
    Twitter: @WorksNational
    (still a huge club HO)
  • irvtrain Members Posts: 1,007 ✭✭
    edited Jan 13, 2019 11:14am #55

    irvtrain wrote:

    raynorfan1 wrote:


    You're missing the point. Of course it's splitting hairs all over the Top 100/200. The curious thing is why the hairs almost always split in the same direction.




    I see what you're saying. I guess courses need to have a fine wine element to them and need to stand the test of time. Sand Hills being in the top 10 in such a short time shows it is possible to crack it.




    Sand Hills isn't that new anymore. It opened in 1995. Also, it has some cachet as the first real middle-of-nowhere destination club and as something of a precursor to the modern minimalist movement. So while it's modern, it also has a classic element to it. It's also insanely special; I've played nearly the whole Top 100 and it's my #1. It will be very hard for any new course to ever again crack the top 10, imo.




    To be honest, I'm unsure why courses don't fluctuate a little more in the top 100 than they do. I've only played Pebble, and that was before I was a panelist. It could also be the limited access, as a lot of courses in the top 100 don't allow us to just come out and play them. So those courses in the top 10 aren't getting a ton of evaluations to cause their scores to fluctuate much.



    edit: To clarify, I played several courses in the top 200 but only Pebble in the top 10.
  • Roadking2003 Austin Members Posts: 5,307 ✭✭
    irvtrain wrote:


    raynorfan1 wrote:


    I'm not a 5 handicap, but I think it's a fair requirement in trying to build a pool of 2000+ raters. They need to have some proxy of general golf knowledge, and there probably isn't a better one than handicap.



    Courses are also all rated from the back tees, by definition, and how difficult they are is a key metric; so having better golfers as raters is rational.




    I just don't agree. Lots of 10 cappers are more in touch with golf than low cappers. Just because you have the skill to play golf doesn't mean you have the skill to evaluate courses.



    And rating a course is 100% subjective, so why do you want only the opinion of low cappers? If 80% of the golfers are above a five, then why have the minority rate courses for the majority?



    I can rate a course from the back tees just as well as a 5 capper. And why should a course rating be based on tees that 95% of us never play?



    Maybe a better proxy is how many times you play per year.




    The system to become a panelist is pretty good. Actually, you can evaluate a course from the tips without playing the tips. You don't necessarily have to be a 5 cap or less if you belong to a club that will give you access to a lot of courses anyway. A big criterion for becoming a panelist is having already played several courses in the top 200, so you know what good courses are. I wouldn't discourage you from applying.




    Interesting. Maybe I will apply. I've played quite a few of the top 200.
  • PuttLeftHitRight Members Posts: 2,261 ✭✭
    Have not seen the 2nd 100 published online yet. Was it only in the mag?
  • adamjstl Members Posts: 565 ✭✭
    I've been fortunate enough to evaluate courses for two of the major publications. The handicap cutoff for Golf Digest is a 5. Personally, I've fluctuated between scratch and a 4 since 2010. Many club pros and assistant pros have NO IDEA that Golf Digest has a handicap limit, and the ones that do think it's 10, not 5.



    Everyone has their own criteria for rating/ranking/evaluating courses; if they do it for someone else, they are simply going to do it by the given guidelines.



    Personally, I've played 19 of the current Top 200: twelve in the Top 100 and seven in the Next 100. The order the magazine ranks them in looks very different from my own personal evaluation.
  • adamjstl Members Posts: 565 ✭✭


    Golf Digest especially sets their rankings up to make it hard for wholesale change. Though their recent changes to criteria seem like an opportunity for a shakeup.



    The reality is there are a lot of immovable objects on these lists.



    I'd suggest that no new course, no matter how good ever really has a chance to reach #1 on any list other than maybe the Golf Week Modern list.




    Very accurate. Friar's Head was #3 on Golfweek's Top 100 Modern by 2014, in contrast to Golf Digest and Golf Magazine's lists, where it very slowly tries to pass the old guard (i.e. country clubs with a long history of championship golf).
  • Luke.Sutton ClubWRX Posts: 120 ClubWRX
    adamjstl wrote:



    Golf Digest especially sets their rankings up to make it hard for wholesale change. Though their recent changes to criteria seem like an opportunity for a shakeup.



    The reality is there are a lot of immovable objects on these lists.



    I'd suggest that no new course, no matter how good ever really has a chance to reach #1 on any list other than maybe the Golf Week Modern list.




    Very accurate. Friar's Head was #3 on Golfweek's Top 100 Modern by 2014, in contrast to Golf Digest and Golf Magazine's lists, where it very slowly tries to pass the old guard (i.e. country clubs with a long history of championship golf).




    Golf Digest has Friar’s Head at 19, but only one course (Sand Hills) is higher on the list if you only include modern courses. Built in 1933, Augusta is the second-newest course on that list that ranks higher than Friar’s Head.



  • raynorfan1 Members Posts: 3,541 ✭✭

    adamjstl wrote:



    Golf Digest especially sets their rankings up to make it hard for wholesale change. Though their recent changes to criteria seem like an opportunity for a shakeup.



    The reality is there are a lot of immovable objects on these lists.



    I'd suggest that no new course, no matter how good ever really has a chance to reach #1 on any list other than maybe the Golf Week Modern list.




    Very accurate. Friar's Head was #3 on Golfweek's Top 100 Modern by 2014, in contrast to Golf Digest and Golf Magazine's lists, where it very slowly tries to pass the old guard (i.e. country clubs with a long history of championship golf).




    Golf Digest has Friar’s Head at 19, but only one course (Sand Hills) is higher on the list if you only include modern courses. Built in 1933, Augusta is the second-newest course on that list that ranks higher than Friar’s Head.




    I don’t have the list in front of me, but Muirfield Village too, no?