As far as I’m concerned, as an 8-9 HDCP, if I’m hitting driver into penalty situations less than 20% of the time, driver is probably the correct play. Closer to 10% if it’s going to be a stroke-and-distance penalty. That’s probably a higher penalty rate than you think.
Any stroke-and-distance penalty by definition costs 2 strokes, while a lateral hazard usually costs you about 1 stroke (it’s actually a little more than 1, since you often have to move farther back to find an acceptable drop area).
It’s somewhat ambiguous how much an extra yard is worth, as it depends on your skill level and how close to the green you are. The article in the OP implies that dropping back 30 yards costs amateurs on average 0.3 shots, or 0.1 shots per 10 yards. On the PGA Tour, however, this number ranges from 0.025 to 0.065 strokes per 10 yards depending on the yardage (per Mark Broadie’s Every Shot Counts). In that same book, Broadie comments that distance matters more for amateurs than it does for pros, which lines up with the data.
Simple expected value calculations will tell you that if 10 yards is worth 0.1 shots, then in order for dropping back 30 yards to be the correct play, you need to reduce the number of times you hit it OB by 15%, or the number of times you hit it into a lateral hazard by 30%, or some appropriate combination of both. There may also be a small increase in fairway%, but unless the fairway is significantly wider where you’re laying up, it doesn’t have a significant impact on the calculations.
If you’re a tour pro and 30 yards is the difference between being 70 yards and 100 yards away, you’re only gaining about 0.075 shots from distance. Now you only need to reduce OB% by 3.75%, or lateral hazard% by 7.5%, in order for laying back 30 yards to be the correct play. However, if 30 yards is the difference between 230 and 200, that distance is worth a lot more, and you need to reduce penalty% by more for it to be mathematically correct.
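The break-even math above is just (strokes lost to distance) divided by (strokes lost per penalty). Here's a quick sketch of that calculation, assuming OB (stroke and distance) costs 2.0 strokes and a lateral hazard costs roughly 1.0 stroke, with distance value expressed in strokes per 10 yards:

```python
# Break-even reduction in penalty frequency needed for laying up to be
# an expected-value win. Assumed penalty costs: OB = 2.0 strokes,
# lateral hazard = ~1.0 stroke (slightly optimistic, per the drop-area caveat).

def break_even_reduction(yards_back, strokes_per_10yd, penalty_cost):
    """How much you must cut your penalty rate for laying up to break even."""
    distance_cost = (yards_back / 10) * strokes_per_10yd  # strokes given up
    return distance_cost / penalty_cost

# Amateur laying back 30 yards at 0.1 strokes per 10 yards:
print(break_even_reduction(30, 0.1, 2.0))    # OB: need ~15% fewer penalty drives
print(break_even_reduction(30, 0.1, 1.0))    # lateral: need ~30% fewer

# Tour pro at wedge range, ~0.025 strokes per 10 yards:
print(break_even_reduction(30, 0.025, 2.0))  # OB: ~3.75%
print(break_even_reduction(30, 0.025, 1.0))  # lateral: ~7.5%
```

Note how the required reduction scales linearly with both the yardage sacrificed and the per-yard cost, which is why the same 30-yard layup is so much easier to justify at wedge range than at 230 yards.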
In DECADE, this corresponds to the idea that there are diminishing returns to getting closer once you get a wedge in your hand. For those familiar with the driving flow chart, it is specifically the box that says “it is unlikely you should drop back to 2i/hybrid unless the hole is short and you will be left with a wedge or less”.
That being said, I question how relevant this is for the majority of amateur golfers. I hit the green about 50% of the time from 100 yards. From 50 yards that number jumps to close to 80%. I’m not sure what my expected strokes to hole out are from either distance, but I’d imagine the difference is quite large, given how rapidly my GIR% increases in this distance range. In fact, I wouldn’t be surprised if that difference in strokes to hole out is greater than the difference going from 150 to 100 yards.
Take a look at the average GIR% on the PGA tour from various distances:
175-200: 53.77%
150-175: 63.20%
125-150: 69.48%
100-125: 74.81%
75-100: 78.11%
The increase in GIR% gets smaller and smaller as they get closer and closer to the hole. Granted, they’re definitely sacrificing GIR% for proximity at the shorter distances, but I don’t think it’s a coincidence that the distances showing the greatest changes in expected strokes to hole out coincide with the distances showing the greatest changes in GIR%. Now compare that to my stats from this year (which are admittedly a small sample size, but I still think they get the general point across):
175-200: 22.2%
150-175: 46.4%
125-150: 40.0%
100-125: 50.0%
75-100: 52.4%
50-75: 70.0%
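The diminishing-returns pattern is easy to see by taking first differences of the GIR% figures listed above. This is just a quick sketch over those same numbers, nothing more:

```python
# First differences of GIR% per distance bucket (moving closer to the hole),
# using the tour averages and my own stats quoted above.
tour = {"175-200": 53.77, "150-175": 63.20, "125-150": 69.48,
        "100-125": 74.81, "75-100": 78.11}
mine = {"175-200": 22.2, "150-175": 46.4, "125-150": 40.0,
        "100-125": 50.0, "75-100": 52.4, "50-75": 70.0}

def deltas(gir):
    """Change in GIR% from each bucket to the next-closer one."""
    vals = list(gir.values())
    return [round(b - a, 2) for a, b in zip(vals, vals[1:])]

print(deltas(tour))  # [9.43, 6.28, 5.33, 3.3] -- steadily shrinking gains
print(deltas(mine))  # [24.2, -6.4, 10.0, 2.4, 17.6] -- biggest jump inside 75 yards
```

For the tour, each step closer is worth less than the last; for my (noisy) amateur numbers, the largest single gain shows up going from 75-100 to 50-75, which is the point of the whole post.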