The 2020 Elimination Final between the Saints and the Bulldogs had the score line:
STK 10.7 (67) def. WB 9.10 (64).
The Dogs had two more scoring shots, and surely if they had been a little more accurate they would have won! After all, converting just one of those behinds would have turned the game. Looking deeper shows that this is a little naive, and we can do better!
A broader consideration would take into account how difficult each scoring shot actually was, compared to the performance of an average league player taking a similar shot.
If the Dogs kicked as accurately as a completely average player, their “expected” score would have been about 69 points from their 19 scoring shots, meaning that they underperformed and should have beaten St Kilda’s 67. However, the Saints also underperformed and should have got a whopping 79 points from their 17 scoring shots.
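Putting those numbers on a per-shot basis makes the gap clearer: the Dogs’ 69 expected points from 19 scoring shots works out to roughly 3.6 points per shot, while the Saints’ 79 from 17 is roughly 4.6 points per shot.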

Although both teams actually kicked poorly, St Kilda had taken their scoring shots from much “higher percentage” locations.
This game has been deliberately cherry picked as an example that raw goalkicking accuracy (goals vs behinds) can be a little deceptive and not the whole story. Not every game is decided by good/bad goalkicking, but seasons can definitely be shaped by it.
To see how much this can matter across a season, here is the 2019 ladder as it would have stood if every team had scored exactly its expected score, alongside the actual ladder (“Excess Pts” is actual premiership points minus expected):
Team | Exp. Pos | Exp. Pts | Exp. % | Actual Pos | Actual Pts | Actual % | Excess Pts |
---|---|---|---|---|---|---|---|
RICH | 1 | 64 | 116 | 3 | 64 | 113.7 | 0 |
GEEL | 2 | 60 | 127.9 | 1 | 64 | 135.7 | 4 |
BL | 3 | 56 | 116.7 | 2 | 64 | 118.3 | 8 |
COLL | 4 | 56 | 115.1 | 4 | 60 | 117.7 | 4 |
PORT | 5 | 54 | 107.5 | 10 | 44 | 105.4 | -10 |
HAW | 6 | 52 | 106.1 | 9 | 44 | 108.7 | -8 |
GWS | 7 | 48 | 115.7 | 6 | 52 | 115.4 | 4 |
WB | 8 | 48 | 109 | 7 | 48 | 107.2 | 0 |
WCE | 9 | 48 | 102.4 | 5 | 60 | 112.5 | 12 |
SYD | 10 | 46 | 98.6 | 15 | 32 | 97.7 | -14 |
NMFC | 11 | 40 | 101.3 | 12 | 40 | 99.5 | 0 |
ESS | 12 | 36 | 87.8 | 8 | 48 | 95.4 | 12 |
MELB | 13 | 36 | 87 | 17 | 20 | 78.6 | -16 |
ADEL | 14 | 34 | 96.6 | 11 | 40 | 100.9 | 6 |
FRE | 15 | 34 | 92 | 13 | 36 | 91.9 | 2 |
STK | 16 | 32 | 92.1 | 14 | 36 | 83.9 | 4 |
CARL | 17 | 30 | 83.6 | 16 | 28 | 84.5 | -2 |
GCFC | 18 | 18 | 63.8 | 18 | 12 | 60.5 | -6 |
In the 2019 season, if every team’s goalkicking performance had been exactly average, West Coast would have won three fewer games and missed the finals. Essendon would also have missed out, while the wasteful Power and Hawks would have landed in positions 5 and 6 respectively.
But how useful is this? It is, of course, not a certainty that every team SHOULD score what an exactly average set of goalkickers would.
Expected score could be used to tell us whether individuals, or whole teams, reliably (with statistical significance) kick better or worse than average. It could also tell us whether teams are attempting a lot of low-percentage shots, or working to find better scoring opportunities. On the other side of the ball, it could tell us whether some teams consistently restrict their opposition to low-percentage shots, or whether they give up easier shots more readily.
Today, however, I’ll just be focusing on individuals’ goalkicking performances.
What is expected score?
Expected score is a metric used in a number of different sports to evaluate the quality of scoring opportunities. At a high level, relevant historical data (e.g. all scoring opportunities in the league over the last 5 years) is used to build a model that predicts the probability of scoring success under a given set of circumstances (distance and angle to goal, defender location, etc.). Expected score is well explored in association football (soccer), as expected goals (“xG”), and in some other analytics-rich sports (e.g. ice hockey). In these sports, where total scoring is relatively low, expected score can give a “better” measure of scoring opportunities than goals and total shots alone.
In Australian rules football (footy), a similar metric can be used. In some ways it is even more statistically relevant in footy than in, say, soccer, as footy games feature a high number of scoring shots, each with a relatively high scoring probability. My expected score model takes into account the following parameters (a rough illustrative sketch follows the list):
- Distance from goal
- Angle from goal
- Subtended angle (how wide the goal posts appear from the shot location)
- Context of shot (e.g. set shot, free shot, pressured shot)
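To make this concrete, below is a minimal sketch of how a goal-probability model could be built from these parameters. It is not my actual implementation: the file name, the column names, the subtended-angle helper and the use of a scikit-learn logistic regression are all assumptions for illustration only.

```python
# Minimal sketch of a shot-probability model: illustrative only, not the
# model actually used for the numbers in this post.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

GOAL_WIDTH = 6.4  # metres between the goal posts

def subtended_angle(x, y):
    """Angle (degrees) the goal mouth subtends from a shot at (x, y),
    with the centre of the goal line at the origin, x across the ground
    and y the distance out from the goal line."""
    return np.degrees(np.arctan2(GOAL_WIDTH * y,
                                 x**2 + y**2 - (GOAL_WIDTH / 2)**2))

# Hypothetical historical data: one row per scoring shot.
shots = pd.read_csv("scoring_shots.csv")  # columns: x, y, context, is_goal
shots["distance"] = np.hypot(shots["x"], shots["y"])
# angle off the direct line to the centre of the goal
shots["angle"] = np.degrees(np.arctan2(np.abs(shots["x"]), shots["y"]))
shots["subtended"] = subtended_angle(shots["x"], shots["y"])

features = ColumnTransformer([
    ("num", StandardScaler(), ["distance", "angle", "subtended"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["context"]),  # set shot, open play, ...
])
model = make_pipeline(features, LogisticRegression(max_iter=1000))
model.fit(shots[["distance", "angle", "subtended", "context"]], shots["is_goal"])

# Modelled goal probability for every historical scoring shot.
p_goal = model.predict_proba(shots[["distance", "angle", "subtended", "context"]])[:, 1]
```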
The technical details of the model are fairly close to Figuring Footy’s model, with a few minor differences not worth going into here (note: his model is probably better than mine). Robert did a lot of work on this years ago, before he was swept out of amateur analytics into the real world of club land. It’s worth exploring what he was able to do if you are interested in this area. I’ll be doing some analysis pieces similar to his, but hopefully with a slightly different slant.
The choice of the above parameters is mostly limited by data availability. As an amateur I have to rely solely on what data is available in the public space. This sets some limitations that can’t be overcome, but the model is still useful nevertheless. The main limitation is that shot location data is only available for scoring shots. Consequently, shots that miss everything (out on the full, falling short, etc.) and rushed behinds do not enter the analysis and do not generate any expected score. The expected score for a scoring shot therefore sits between 1 (0% chance of a goal) and 6 (100% chance of a goal). This means every result below effectively assumes each subject misses shots at an exactly average rate (about 20%): shanks out on the full from 20m out directly in front are simply not registered!
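Since a scoring shot is worth either 6 points (goal) or 1 point (behind), turning the modelled goal probability into an expected score is simple arithmetic. As a quick sketch (hypothetical helper names, not from my actual code):

```python
def shot_xscore(p_goal):
    # 6 points for a goal, 1 point for a behind:
    # expected value = 6*p + 1*(1 - p) = 1 + 5*p, so it always sits between 1 and 6.
    return 1 + 5 * p_goal

# A player's (or team's) xScore is just the sum over their scoring shots.
def total_xscore(p_goals):
    return sum(shot_xscore(p) for p in p_goals)
```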
Some other things that could be considered include the prevailing weather/ground conditions, the individual’s goalkicking ability, crowd effects and game context (early/late in game, close game or blowout, etc.). While it would be possible to account for many of these, this has the potential to partition data so much that each individual scenario has too few historical precedents to produce a meaningful estimate of performance.
The AFL data custodians, Champion Data, produce their own expected score metric (published in the Herald Sun). While I don’t have the full details of their model, they are able to take advantage of more of the available data, including non-scoring shots, pressure ratings, the kicking skill selected, etc. My results are different to theirs but broadly show the same patterns. Curiously, CD’s expected scores always seem to be higher, but I don’t have the database to check whether this is valid or not (over all the data, total expected score should equal total actual score).
Is goalkicking just dumb luck?
This is a very important question. Determining whether a player is a good or bad goalkicker requires a proper statistical analysis. Such an analysis effectively determines the chance that a completely average goalkicker would perform as well (or as badly) as the subject. We can never be completely certain either way; we can only establish it beyond reasonable doubt, and exactly where “reasonable doubt” sits as a value is up for debate.
For the analysis below, I will quote a “sig (%)” value for each subject. This is the percentage chance that they are actually better (positive) or worse (negative) than the average goalkicker, based on the statistical analysis (using Poisson binomial trials, for those playing along at home). For example, a value of “-84%” means that I am 84% certain the subject is a worse goalkicker than average.
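I won’t go through the full mechanics here, but as a rough sketch of the idea: treat a player’s shots as a Poisson binomial (each shot has its own modelled goal probability), simulate a large number of “average kicker” outcomes over those same shots, and see how often the average kicker falls short of (or beats) the player’s actual tally. The function below is an illustrative assumption about how such a number could be computed, not my exact method.

```python
import numpy as np

def goalkicking_sig(p_goals, actual_goals, n_sims=100_000, seed=0):
    """Rough sketch: signed % confidence that a player is better (+) or
    worse (-) than an average kicker, given the modelled goal probability
    of each of their scoring shots and their actual goal tally."""
    rng = np.random.default_rng(seed)
    p = np.asarray(p_goals)
    # Each row is one simulated 'average kicker' over the same set of shots
    # (a Poisson binomial trial, since every shot has its own probability).
    sim_goals = (rng.random((n_sims, p.size)) < p).sum(axis=1)
    if actual_goals >= p.sum():
        # chance an average kicker kicks strictly fewer goals than the player did
        return 100 * (sim_goals < actual_goals).mean()
    # chance an average kicker kicks strictly more goals than the player did
    return -100 * (sim_goals > actual_goals).mean()
```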
For each player in the 2021 season so far (up to but not including the R13 Queen’s Birthday match between MEL and COL), I have used my expected score model to predict the expected number of goals (xGoals) they should have kicked from their chances, and compared that to their actual performance. The table below is ordered by the significance of each player’s 2021 performance, and also includes their significance over the 2018-2020 period. There is a qualification criterion of xGoals > 5 to help ensure a sufficiently cromulent data set.
The Sharpshooters
name | goals | xGoals | scoring shots | miss rate (%) | score | xScore | 2021 sig (%) | 2018-20 sig (%) |
---|---|---|---|---|---|---|---|---|
JElliott | 10 | 5.6 | 10 | 0 | 60 | 37.9 | 98.6 | -50.8 |
JJKennedy | 28 | 21.4 | 39 | 15.2 | 179 | 146.1 | 96.8 | 79.7 |
AMcD-Tip. | 27 | 21 | 36 | 5.3 | 171 | 141.2 | 96 | 98.6 |
LFranklin | 25 | 19.6 | 32 | 15.8 | 157 | 130 | 95.8 | 74.0 |
JBruce | 34 | 28.1 | 44 | 15.4 | 214 | 184.6 | 94.5 | 79.6 |
ZBailey | 17 | 12.6 | 22 | 24.1 | 107 | 84.8 | 94.3 | -15.5 |
RGray | 15 | 11 | 19 | 26.9 | 94 | 74 | 94 | 92.8 |
LBreust | 20 | 16 | 25 | 16.7 | 125 | 104.8 | 92.7 | 71.7 |
ELangdon | 10 | 6.8 | 13 | 13.3 | 63 | 46.9 | 92 | -95.8 |
DFogarty | 13 | 9.4 | 17 | 10.5 | 82 | 63.9 | 91.7 | 99.1 |
DMcStay | 11 | 8 | 13 | 7.1 | 68 | 53.2 | 91.1 | -95.9 |
WSnelling | 10 | 7.2 | 12 | 14.3 | 62 | 48.1 | 88.7 | 90.1 |
GRohan | 20 | 16.1 | 27 | 12.9 | 127 | 107.4 | 87.4 | 32.7 |
JJones | 10 | 7.5 | 12 | 7.7 | 62 | 49.4 | 87.2 | -15.6 |
While the “significance %” is the key result, it isn’t quite the whole story, so we should be a little careful with the above results. Jamie Elliott (COL) has kicked 10 straight goals in 2021 to date with zero non-scoring shots, which is exceptional, but he only just creeps over the qualification criterion. It’s also worth pointing out that over the 2018-2020 period he was exceptionally average.
Someone like Robbie Gray is quite likely a good shot, but his miss rate (non-scoring shots) of 27% is quite high (recall that non-scoring shots are not accounted for in the expected score measure), so we may need to be a little careful. However, in 2018-2020 Robbie was again significantly better than average, suggesting his good performance is consistent; the same goes for the other names above with strongly positive 2018-2020 significance.
Surprisingly, a couple of names in there (Langdon and McStay) were actually significantly bad goalkickers over the 2018-2020 period!
The Shankers
name | goals | xGoals | scoring shots | miss rate (%) | score | xScore | 2021 sig (%) | 2018-20 sig (%) |
---|---|---|---|---|---|---|---|---|
NFyfe | 5 | 13.4 | 22 | 15.4 | 47 | 88.9 | -99.9 | -56.1 |
JBattle | 3 | 6.5 | 10 | 28.6 | 25 | 42.4 | -97.7 | 26.3 |
TJLynch | 18 | 24.4 | 40 | 7 | 130 | 162.2 | -97.5 | -45.9 |
ANaughton | 29 | 36.5 | 55 | 12.7 | 200 | 237.6 | -97.4 | -69.0 |
MKing | 16 | 22.6 | 38 | 15.6 | 118 | 150.8 | -97 | -83.2 |
LDavies-Uniacke | 3 | 6.5 | 11 | 8.3 | 26 | 43.7 | -96.3 | 2.9 |
SHiggins | 2 | 5.3 | 10 | 33.3 | 20 | 36.5 | -94.8 | 3.4 |
SBerry | 5 | 8.2 | 14 | 17.6 | 39 | 54.8 | -92.3 | N/A |
IHill | 10 | 13.7 | 22 | 35.3 | 72 | 90.7 | -90.8 | -65.1 |
MFrederick | 5 | 8.3 | 16 | 0 | 41 | 57.5 | -90.2 | -61.9 |
LDahlhaus | 4 | 6.5 | 11 | 15.4 | 31 | 43.7 | -90 | -45.5 |
JLukosius | 3 | 5.6 | 11 | 26.7 | 26 | 38.8 | -87.9 | -25.2 |
SSwitkowski | 4 | 6.5 | 11 | 0 | 31 | 43.5 | -87.1 | -15.9 |
LMcNeil | 6 | 8.7 | 14 | 17.6 | 44 | 57.3 | -85.4 | N/A |
It’s no surprise to see Fyfe up there along with a few other names having goalkicking troubles this year. Naughton has had a solid scoring output this year so far with 29 goals but should be a fair way ahead of that!
Over a longer, previous period (2018-2020), this year’s bad goalkickers were still bad for the most part, but just not nearly as significantly so.
In the last couple of weeks, Jack Higgins has been put through the wringer for failing to convert, but it turns out he really isn’t doing that badly across the season so far, having kicked 17 goals from an expected 17.6.
Where to from here?
There’s a lot more slicing and dicing that can be done with this data, and I hope to explore more in time. What we can see from a few basic results is that goalkicking is a skill in that some players are significantly better than average over long periods.
Most players cannot be shown to be significantly better or worse than average over a long period. This suggests that either most players are pretty much average and luck plays a big role, or players go in and out of form over shorter periods.
Next time, I’ll look at teams as entities: how they produce scoring opportunities, and what scoring opportunities they concede.