I thought I'd do some back-of-the-envelope calculations (seriously, I found a spare envelope and sketched this out) to illustrate the point. Let's assume that any given fielder has a certain throwing accuracy, which we can measure in degrees. To visualize this, put your nose up to the circle at the center of this protractor:
Our target is at 90 degrees. If a player's accuracy is ±2 degrees, that would mean that, even under stressful, rushed conditions, that player can routinely make a baseball travel within a "window" between 88 and 92 degrees. It's a very small window, but Major League Baseball players are pretty amazing people.
The longer the throw, however, the more a minor angular error in the line along which he fires the baseball will cause it to sail away from his target; the miss distance is roughly the throw distance times the tangent of the angular error. If we take an outfielder who is 270 feet from home plate, lobbing a baseball at a runner heading home, small angular errors can result in the ball missing the plate by quite a bit:
At 270 feet:

| Angular Error (± Degrees) | Misses Home Plate By (Feet) |
|---------------------------|-----------------------------|
| 0.0                       | 0.0                         |
| 0.5                       | 2.4                         |
| 1.0                       | 4.7                         |
| 1.5                       | 7.1                         |
| 2.0                       | 9.4                         |
| 2.5                       | 11.8                        |
| 3.0                       | 14.2                        |
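If you want to check my envelope math, here's a quick Python sketch (my own simplification, not David's) that reproduces the table above, assuming the miss distance is simply the throw distance times the tangent of the angular error:

```python
import math

def miss_distance(throw_distance_ft, angular_error_deg):
    """Lateral miss (in feet) for a throw that is off-line by the given angle.

    Back-of-the-envelope assumption: miss = distance * tan(error). This
    ignores arc, spin, wind, and everything else about a real throw.
    """
    return throw_distance_ft * math.tan(math.radians(angular_error_deg))

# Reproduce the 270-foot table above.
for error in (0, 0.5, 1, 1.5, 2, 2.5, 3):
    print(f"+/- {error:>3} degrees -> misses home plate by "
          f"{miss_distance(270, error):.1f} ft")
```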
This is where the cutoff man comes into play. Let's assume (as David did in his article) that the cutoff man is 90 feet from home plate, rather than 270 feet. Let's further assume that the average MLB fielder's throwing accuracy, under pressure, is ±2 degrees. This is a guess, but it seems sort of reasonable. Here's what happens (the graphic shows a blimp's-eye view of the field):
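Running the same calculation for the cutoff man's 90-foot throw (again, just my distance-times-tangent approximation) shows how much the miss window shrinks:

```python
import math

# Same back-of-the-envelope assumption: miss = distance * tan(angular error).
for distance_ft in (270, 90):
    miss = distance_ft * math.tan(math.radians(2))  # a +/- 2 degree throw
    print(f"From {distance_ft} ft, a 2-degree error misses the plate by {miss:.1f} ft")
```

In other words, under these assumptions, the same ±2-degree arm that can be nearly ten feet off line from the outfield puts the ball within about three feet of the plate from the cutoff spot.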
Therefore, not only does hitting the cutoff man allow a more direct route (the lower launch angles of shorter throws help offset the time it takes the infielder to catch, transfer, and throw again), and not only does it help prevent the hitter from advancing to second base on a throw home, but it also means that the ball will usually arrive in a better location for the catcher to make a play on the runner than if it were thrown all the way from the outfield.