23rd March 2018 – Updated
There are two main amateur rankings, the World Amateur Golf Ranking (WAGR) and the Scratch Players World Amateur Ranking (SPWAR).
The WAGR is run by The R&A and USGA. It is currently based in St. Andrews but is reported to be moving to the United States and USGA control in Autumn 2018.
The SPWAR is a labour of love for Fred Solomon, who lives and works in San Francisco.
In summary, the WAGR has credibility through its association with golf’s main governing bodies but lacks the clarity and accuracy of the SPWAR.
Here are my 9 main criticisms of the WAGR and the reasons why the SPWAR is better:
1. The methodology, based on stroke averages per round, is too complicated (at least for me) with Counting Events, Ranking Scratch Scores, Divisors, Bonus Ranking Points and worst of all Participation Points. Take a look at the WAGR Guidance Notes to see for yourself. The SPWAR seems easier to follow with points simply allocated on finishing positions and field quality, although I accept there is a lack of transparency around individual points from each event (just a total for each player is shown).
2. There is no points depreciation or ageing in the WAGR. Points earned 24 months ago are currently treated the same as points earned last week. In the SPWAR, points hold their full value for 30 days, then age daily down to 80% of their value by day 90, and then decline in equal daily increments until they are lost altogether after two years.
3. With no ageing applied, the WAGR often sees dramatic position swings over time. By contrast, the SPWAR changes more smoothly and gradually because it applies points ageing.
4. Despite having a team of workers the WAGR covers significantly fewer events than the one-man band SPWAR, and as a result it includes fewer players and does not track them as accurately as the SPWAR does.
5. The WAGR’s treatment of team matches is another key weakness in its ranking. The WAGR simply gives everyone who participates in a team match the same points regardless of the result and individual performances. Just take a look at the WAGR point allocations for the Walker Cup, Arnold Palmer Cup, Home Internationals or Bonallack Trophy if you don’t believe me. Far more sensibly, the SPWAR only awards points to those players who have contributed positively to the match. Wins are weighted in favour of singles over fourballs and foursomes, and at the end of the match the total points won by each player are used to determine their SPWAR points allocation.
6. The WAGR is generally slow to remove new pros from its list. This is of course critical to the accuracy of any amateur ranking. Between September and February each year this is hard work, and unfortunately the WAGR just doesn’t follow the players, the various qualifying schools and satellite tours closely enough. I bet I can find some pros in its current ranking. The SPWAR seems to deal with new pros far more quickly, thus maintaining the robustness of its listing.
7. The WAGR is fairly strict on only including 54-hole stroke play competitions, therefore ignoring all 18 and 36-hole events. So if an amateur does well in, say, Final Qualifying for The Open or US Open Sectional Qualifying, he doesn’t get rewarded by the WAGR for it. Amateur golf is varied and the SPWAR shows the necessary flexibility for such high-profile competitions. A quick look through the player record of any member of the SPWAR’s Top 100 normally throws up a ‘short’ event entry where points have been earned. The WAGR’s approach has recently led to events pairing up to try to circumvent this rule and achieve WAGR recognition.
8. Weekly announcements may be acceptable in the pro game where virtually every tournament finishes on a Sunday but in the amateur game it is anything but. Competitions finish on every day of the week. The WAGR is released at 12 noon every Wednesday, the weekly update including events that finish up to and including the previous Sunday. Therefore if an event finishes on a Monday we have to wait nine days for it to be reflected in the WAGR. In the age of instant news and gratification this is unacceptable, particularly for the players. Mr. Solomon updates his SPWAR for all significant events within 24 hours *, often within a few minutes of the final scores being posted. Smaller competitions are sometimes prioritised as less urgent, particularly in the main summer season, but nearly always make it in before the WAGR update.
[* The only exception to this is US College events, where the SPWAR uses an adjusted average of the Golfstat and Golfweek head-to-head collegiate rankings (placing more emphasis on events played) to make in-season periodic ranking point allocations. For NCAA Division I this is done six times a year; two inputs in the autumn and four through the spring. As the SPWAR spring inputs are added more players are included, rising from the top 275 at the end of February to 692 before the Regional Championships start. When it gets to the important end-of-season Regional and National Championships the SPWAR is of course updated immediately. As the WAGR includes US College regular season events as they occur it is perhaps better than the SPWAR in this regard. However, the quality of the Golfstat results is such that Mr. Solomon argues that “accuracy trumps timeliness every time”.]
9. Anyone who follows Golf Bible on Twitter or has read my website articles knows I analyse both rankings carefully. I often look at the WAGR and think ‘that ranking doesn’t look right’. I have also had correspondence with many players and parents expressing confusion over their own and others’ WAGR positions. I have never looked at the SPWAR and questioned a ranking – the list just makes sense – and I have never heard a single piece of criticism of it. If you look down both lists you will find some glaring ranking anomalies. In my opinion, when assessed properly, these always favour the SPWAR (and can normally be explained by one of the weaknesses listed above).
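The points-ageing schedule described in point 2 above can be sketched in a few lines of code. To be clear, this is my own illustrative sketch, not Mr. Solomon’s published formula: the linear interpolation between the stated breakpoints and the 730-day cut-off are assumptions I have made from the description.

```python
def spwar_age_factor(days_old: int) -> float:
    """Approximate SPWAR points-ageing multiplier.

    Assumptions (the SPWAR's exact formula isn't published):
    linear interpolation between the stated breakpoints and a
    two-year (730-day) cut-off.
    """
    if days_old <= 30:            # points hold full value for 30 days
        return 1.0
    if days_old <= 90:            # age daily down to 80% by day 90
        return 1.0 - 0.2 * (days_old - 30) / 60
    if days_old < 730:            # equal daily decrements to zero at two years
        return 0.8 * (730 - days_old) / 640
    return 0.0                    # points lost after two years


# Example: how 100 points earned at an event would decay over time
for d in (10, 60, 90, 410, 730):
    print(d, round(100 * spwar_age_factor(d), 1))
```

Whatever the exact shape of the curve, the key point is that recent form counts for more than old results, which is what produces the SPWAR’s smoother week-to-week movement.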
I know amateur rankings aren’t perhaps a major priority for either The R&A or the USGA, but for people who follow amateur golf this is really important.
For me, too many people are quoting and using the WAGR without understanding how flawed it really is. Indeed, I am increasingly ignoring the WAGR and just concentrating on the SPWAR.
Let’s be clear: the WAGR is easily second best and showing no signs of catching up with its rival, the SPWAR. That’s not normally something I associate with The R&A, an organisation I hold in the highest regard.
APPENDIX: A BRIEF HISTORY OF THE AMATEUR RANKINGS
The World Amateur Golf Ranking (WAGR) was launched on 23rd January 2007. It was established by David Moir, a member of staff in The R&A’s entries department.
Its origins lie in the handicap balloting out of the reigning Australian Amateur champion, Andrew Martin, at the 2004 Amateur Championship. It was clear that, because of different handicapping systems around the world, the adoption of lowest handicap as the primary entry criterion was no longer appropriate. A new approach or safety net was needed to ensure that such errors were not made again and that playing fields were always at their strongest.
Andy McDonald took over from Mr. Moir in 2008 and up until late last year still headed up the team within R&A Championships Ltd that manages it. There was a team of five people working for him, at least according to an article in the 2014 Golfer’s Handbook.
2011 was an important year for the WAGR. It started to produce a Women’s ranking but more importantly gained the endorsement of the USGA, giving it credibility around the world.
In 1999 Fred Solomon, a scratch golfer and pensions executive from San Francisco, established the Scratch Players Group with some friends. They planned to create a tour for elite golfers, amateur and pro, providing assistance with hosting tournaments ambitiously around the world.
In 2002 Mr. Solomon started to contemplate putting together a world amateur ranking to support their work. However, it was not until February 2004 that work started on the Scratch Players World Amateur Ranking (SPWAR).
After compiling and testing his list in 2005 and 2006, Mr. Solomon launched the SPWAR on the internet on 13th January 2007. This was 10 days before the WAGR, making it the first to be released.
Mr. Solomon sought to gain the buy-in of the USGA to his ranking, which quickly became popular with event organisers in the United States. To his disappointment, though presumably not his surprise, after some delay the USGA decided to endorse the WAGR at its annual meeting in February 2011. Mr. Solomon argues that the SPWAR was superior at all times prior to, at the time of and since this decision was made.
Fred Solomon continues to work on the SPWAR alone and without recompense. His only reward is that the “Gold Standard”, as he calls it, male-only SPWAR is generally accepted as being superior to the WAGR. Interestingly, despite the USGA’s support, virtually all of the non-USGA events in the USA use the SPWAR exclusively or as their dominant entry-criteria ranking.
Copyright © 2016-18, Mark Eley. All rights reserved.