8th February 2016 (Updated 12th January 2017 and 23rd June 2017)
The World Amateur Golf Ranking (WAGR) announced changes to their ranking methodology on 4th February 2016.
Amongst a few other minor changes, the WAGR website news release told us that they are extending their ranking period from the current one year to two years.
Without wishing to re-tell the story of amateur rankings, which I have summarized to a degree in the Appendix below, I feel this is a hugely disappointing announcement.
There are two main amateur rankings, the WAGR, which is R&A-led and based in St. Andrews, and the Scratch Players World Amateur Rankings (SPWAR), compiled by Fred Solomon in San Francisco. The SPWAR has used a two year ranking period since inception.
Despite having fewer resources and a lower profile, the SPWAR is far more accurate and therefore more useful for players, event organisers, coaches and journalists. The WAGR has credibility through association but is currently doing the amateur game a disservice in my view.
Changes to rankings are made infrequently, so for the WAGR to make some that neither eliminate its weaknesses nor, frankly, make it any better is surely a missed opportunity.
To help you better understand my concerns here’s a list of my 10 main criticisms of the WAGR: –
1. The methodology, based on stroke averages per round, is too complicated (at least for me), with Counting Events, Ranking Scratch Scores, Divisors, Bonus Ranking Points and, worst of all, Participation Points. The SPWAR is easier to follow, with points simply allocated on finishing positions and field quality, although the allocation is not transparent for each event (just the total is shown).
2. The WAGR has never had points depreciation or ageing, and there are no plans to introduce it. Points earned 12 months ago are currently treated the same as points earned last week. This ‘error’ will now be accentuated even further, as the WAGR have stated that they do not plan to “weight recent play more heavily” even though they are moving to two years. In the SPWAR, points hold good for 30 days, then age daily to 80% by day 90, then age in equal daily increments before being lost entirely after two years.
3. With no ageing applied, the WAGR often sees more dramatic position swings over time. By contrast, the SPWAR changes more smoothly and gradually because it applies points ageing.
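The SPWAR ageing schedule described above can be sketched as a simple weighting function. This is my own illustration, not Solomon’s published code; in particular, the assumption that points decline in equal daily steps between days 30 and 90 (from 100% down to 80%) is mine, inferred from the phrase “age daily to 80% after 90 days”.

```python
def spwar_weight(days_old: int) -> float:
    """Fraction of original SPWAR ranking points retained after `days_old` days.

    Schedule (as described in the article): full value for 30 days,
    ageing daily to 80% by day 90, then equal daily decrements until
    the points are lost entirely at two years (taken here as 730 days).
    """
    if days_old <= 30:           # points hold good for 30 days
        return 1.0
    if days_old <= 90:           # age daily from 100% to 80% over days 30-90
        return 1.0 - 0.20 * (days_old - 30) / 60
    if days_old < 730:           # equal daily steps from 80% down to zero
        return 0.80 * (730 - days_old) / 640
    return 0.0                   # points lost after two years
```

Under this sketch a result from last month still counts in full, a six-month-old result counts for roughly 70%, and a result from 18 months ago for under 40% — the gradual decay that produces the SPWAR’s smoother movements.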
4. Despite having a team of workers, the WAGR covered only around 2,500 events in its men’s ranking last year – a number that is falling, by the way. The SPWAR picks up around 4,000 and is therefore including more players and tracking them more accurately than the WAGR.~
[~ Since writing this article The 2016 Bonallack Trophy teams have been announced. This weakness within the WAGR has been exposed by some members of the Asia-Pacific team, particularly those competing in Asia. Look at the following rankings (as at mid-February): –
Han-Ting CHIU (Chinese Taipei) – WAGR 856 / SPWAR 319
Jae-Kyeoung LEE (KOR) – WAGR 1078 / SPWAR 162
Variances of this extent should not occur and are undoubtedly a result of Messrs. Chiu and Lee playing in events that the WAGR are ignoring. I know it must be hard to decipher some of the scores produced in the Far East but surely the fact the Asia-Pacific Golf Confederation are picking these players suggests their rankings must be understated in the WAGR]
5. The WAGR’s treatment of team matches is another key weakness in its ranking. The WAGR used to ignore doubles matches and count only singles, which was odd, but nowadays it simply gives everyone who makes a team the same points regardless of their performance. Just take a look at the WAGR point allocations for the 2015 Walker Cup, Palmer Cup or Home Internationals if you don’t believe me. In the WAGR’s assessment of the Walker Cup, Jimmy Mullen [W4/H0/L0] received the same points as Jordan Niebrugge [W0/H0/L3]. That simply can’t be right. Far more sensibly, the SPWAR awards points for wins weighted in favour of singles (2) over doubles (1) for each player. At the end of the match all of these player values are added up so that each player’s contribution can be analysed. Only players with a positive total value at the end of a match are considered for SPWAR points; the player with the highest score is deemed the “winner” and awarded the most points.
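The SPWAR team-match calculation just described can be sketched as follows. The player names and records here are hypothetical, and the source describes only the weighting of wins (singles 2, doubles 1) — how halves or losses affect the total, and how ties on the top score are broken, is not specified, so this sketch counts wins only.

```python
def match_value(singles_wins: int, doubles_wins: int) -> int:
    # SPWAR weights wins in favour of singles (2) over doubles (1)
    return 2 * singles_wins + 1 * doubles_wins

# Hypothetical team-match records: (singles wins, doubles wins)
records = {"Player A": (2, 2), "Player B": (1, 0), "Player C": (0, 0)}

# Total value per player at the end of the match
totals = {name: match_value(s, d) for name, (s, d) in records.items()}

# Only players with a positive total are considered for SPWAR points;
# the player with the highest total is deemed the "winner" of the match.
eligible = {name: v for name, v in totals.items() if v > 0}
winner = max(eligible, key=eligible.get)
```

Applied to the Walker Cup example above, this approach would clearly separate a 4-0 singles record from an 0-3 one, which is exactly the distinction the WAGR’s flat allocation fails to make.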
6. The WAGR are generally slow to remove new pros from their list. This is of course critical to the accuracy of any amateur ranking. Between September and February each year this is hard work, and unfortunately the WAGR just don’t follow the players, the various qualifying schools and the satellite tours closely enough. I bet I can find some pros in their current ranking. The SPWAR seems to deal with these cases far more quickly, thus maintaining the robustness of its listing.
7. The WAGR is fairly strict in only including 54-hole stroke play competitions, and therefore ignores all 18- and 36-hole events. So if an amateur does well in, say, final qualifying for The Open or US Open sectional qualifying, he doesn’t get rewarded by the WAGR for it. Amateur golf is varied, and the SPWAR shows the necessary flexibility for such high-profile competitions. Quickly looking through the SPWAR’s Top 100 you will see most players have a ‘short’ event entry where points have been earned.
[See Update 1 below for an interesting development in January 2017 relating to this WAGR requirement.]
8. Weekly announcements may be acceptable in the pro game where virtually every tournament finishes on a Sunday afternoon or evening but in the amateur game it is anything but. Competitions finish on every day of the week. The WAGR is released at 12 noon every Wednesday, the weekly update including events that finish up to and including the previous Sunday. Therefore if an event finishes on a Monday we have to wait nine days for it to be reflected in the WAGR. In the age of instant news and gratification this is unacceptable, particularly for the players. Mr. Solomon updates his SPWAR for all significant events within 24 hours *, often within a few minutes of the final scores being posted. Smaller competitions are sometimes prioritised as less urgent, particularly in the main summer season, but nearly always make it in before the WAGR update.
[* The only exception to this is US College events where the SPWAR uses an adjusted average of the Golfweek and Golfstat head-to-head collegiate rankings (placing more emphasis on events played) to make in-season periodic ranking point allocations – six times a year for Division 1. The WAGR includes events as they occur whereas the SPWAR sometimes lags behind where college results are concerned, although Solomon argues “accuracy trumps timeliness every time”. When it gets to the important end of season Regional and National Championships the SPWAR is of course updated immediately.]
9. Anyone who follows Golf Bible on Twitter or has read my website articles knows I analyse both rankings carefully. I often look at the WAGR and think ‘that ranking doesn’t look right’. I have never looked at the SPWAR and questioned a ranking – the list just makes sense. How can the WAGR be taken seriously when Bryson DeChambeau is not their No. 1? He is currently 4th in their listing despite being the reigning US Amateur and NCAA Division 1 champion. That is an achievement of historic proportions, yet sadly it isn’t reflected in his WAGR status. If you look down both lists you will find some glaring ranking anomalies. In my opinion, when assessed properly, these anomalies always favour the SPWAR (normally for the reasons listed above).
10. The WAGR refuses to include the results of a new event in its first year, whatever its status. I have no idea why. At the end of the day, if all the players compete fairly in an event played on a recognised course under the auspices of a respectable governing body, then what’s the problem with including it? The most notable recent case is the African Amateur Stroke Play Championship, played for the first time in 2016 at the magnificent Leopard Creek GC in South Africa. With all of the leading African players joined by touring parties from Scotland, England, Italy and Switzerland, the field was very strong, yet it was excluded by the WAGR. The SPWAR incorporated the African Amateur results immediately.
[On 23rd June 2017 the WAGR announced that they were relaxing “the constraints of probation so that events starting on or after 1st June 2017 will no longer have to go through probation.” One weakness dealt with; just a few more to go now.]
The title of this article may be sensational and I know amateur rankings aren’t perhaps a major priority for either the R&A or the USGA but for people that follow amateur golf this is really important.
For me, too many people are quoting and using the WAGR without understanding how flawed it really is. I am starting to wonder whether I should ignore the WAGR and just concentrate on the SPWAR. I have every sympathy for Fred Solomon. He’s working on his own, producing a very high quality product, but is largely being ignored, at least by many people on this side of the Atlantic.
For me the WAGR have missed an opportunity to improve the quality of their ranking product with this latest update. They need to do much better, and I think The R&A, who still own the WAGR, need to take more interest in the quality of their output. Let’s be clear: the WAGR is easily second best and showing no signs of catching up with its rival, the SPWAR. That’s not normally something I associate with The R&A, an organisation I hold in the highest regard.
UPDATE 1 – 12th January 2017
There are a number of 36-hole competitions for women in England throughout the season. Unfortunately none of them have historically counted towards the WAGR which refuses to acknowledge events that are less than 54-holes in length.
England Golf’s response – pairing existing 36-hole events into combined tournaments so that the results can count – is a sensible approach, but it is clearly contrived, and a little sad that the WAGR could not have been persuaded to review this weakness in their current approach.
I say contrived because the dates of the existing 36-hole events will not change, and a combined leaderboard will have to be created to show overall results for these newly paired tournaments. The oddest new event on the 2017 list is the Frilford Jackson, which combines the Frilford Heath Scratch at Frilford Heath in Oxfordshire on 2nd June with the Bridget Jackson Bowl, played 90 miles away in Handsworth, Warwickshire, on 4th July.
APPENDIX: RANKING NOTES
The World Amateur Golf Ranking (WAGR) was launched on 23rd January 2007. It was established by David Moir, a member of staff in The R&A’s entries department.
Its origins lie in the handicap balloting out of the reigning Australian Amateur champion, Andrew Martin, at the 2004 Amateur Championship. It was clear that, because of the different handicapping systems around the world, the adoption of lowest handicap as the primary entry criterion was no longer appropriate. A new approach, or safety net, was needed to ensure that such errors were not made again and that playing fields were always at their strongest.
Andy McDonald took over from Moir in 2008 and still heads up the team within R&A Championships Ltd that manages it. He now has a team of five people working for him, at least according to an article in the 2014 Golfer’s Handbook.
2011 was an important year for the WAGR. It started to produce a Women’s ranking but more importantly gained the endorsement of the USGA, giving it credibility around the world.
In 1999 Fred Solomon, a scratch golfer and pensions executive from San Francisco, established the Scratch Players Group with some friends. They planned to create a tour for elite golfers, amateur and pro, providing assistance with hosting tournaments ambitiously around the world.
In 2002 Solomon started to contemplate putting together a world amateur ranking to support their work. However, it was not until February 2004 that work started on the Scratch Players World Amateur Ranking (SPWAR).
After compiling and testing his list in 2005 and 2006, Solomon launched the SPWAR on the internet on 13th January 2007 – 10 days before the WAGR, making it the first to be released.
Solomon sought to gain the buy-in of the USGA for his ranking, which quickly became popular with event organisers in the United States. To his disappointment, after some delay the USGA decided to endorse the WAGR at their annual meeting in February 2011. Solomon argues that the SPWAR was superior at all times prior to, at the time of, and since this decision was made. I am sure he accepts that on relationship grounds alone the USGA probably had no choice.
Fred Solomon continues to work on the SPWAR alone and without recompense. The male-only SPWAR, the “Gold Standard” as he calls it, is, to be fair, generally accepted as being far superior to the WAGR. Interestingly, despite the USGA’s support, virtually all of the non-USGA events in the USA use the SPWAR exclusively or as their dominant entry-criteria ranking.
Copyright © 2016-17, Mark Eley. All rights reserved.